US20010056443A1 - Apparatus and method for presenting navigation information based on instructions described in a script - Google Patents

Apparatus and method for presenting navigation information based on instructions described in a script

Info

Publication number
US20010056443A1
US20010056443A1 (application US09/392,221)
Authority
US
United States
Prior art keywords
information
navigation
point
time
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/392,221
Other versions
US6336072B1
Inventor
Kuniharu Takayama
Minoru Sekiguchi
Hirohisa Naito
Hisayuki Horai
Yoshiharu Maeda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Assigned to FUJITSU LIMITED (assignment of assignors' interest). Assignors: HORAI, HISAYUKI; MAEDA, YOSHIHARU; NAITO, HIROHISA; SEKIGUCHI, MINORU; TAKAYAMA, KUNIHARU
Priority to US09/991,921 (US6748316B2)
Priority to US09/994,003 (US6697731B2)
Publication of US20010056443A1
Application granted
Publication of US6336072B1
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3629 - Guidance using speech or audio output, e.g. text-to-speech
    • G01C21/3644 - Landmark guidance, e.g. using POIs or conspicuous other objects
    • G01C21/3647 - Guidance involving output of stored or live camera images or video streams
    • G01C21/3655 - Timing of guidance instructions
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 - Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 - Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • G08G1/096805 - Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096811 - Systems involving transmission of navigation instructions to the vehicle where the route is computed offboard
    • G08G1/096827 - Systems involving transmission of navigation instructions to the vehicle where the route is computed onboard
    • G08G1/096833 - Systems involving transmission of navigation instructions to the vehicle where different aspects are considered when computing the route
    • G08G1/096844 - Systems involving transmission of navigation instructions to the vehicle where the complete route is dynamically recomputed based on new data
    • G08G1/096855 - Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
    • G08G1/096861 - Systems involving transmission of navigation instructions to the vehicle where the immediate route instructions are output to the driver, e.g. arrow signs for next turn
    • G08G1/096872 - Systems involving transmission of navigation instructions to the vehicle where instructions are given per voice
    • G08G1/096877 - Systems involving transmission of navigation instructions to the vehicle where the input to the navigation device is provided by a suitable I/O arrangement
    • G08G1/096883 - Systems involving transmission of navigation instructions to the vehicle where input information is obtained using a mobile device, e.g. a mobile phone, a PDA
    • G08G1/096888 - Systems involving transmission of navigation instructions to the vehicle where input information is obtained using learning systems, e.g. history databases

Definitions

  • the present invention relates to a navigation information presenting apparatus and method for providing navigation information of a route, etc. by using a markup language description, and more particularly to a technique which can be applied to car navigation systems, personal computers, PDAs (Personal Digital Assistants), PDC (Personal Digital Cellular), PHS (Personal Handyphone System) terminals, etc., and which can provide route information or additional information such as route navigation, sightseeing information, a delivery plan, a travel plan, traffic control, scheduling, amusement, a municipal service, etc. via a network or an electronic medium.
  • PDA: Personal Digital Assistant
  • PDC: Personal Digital Cellular
  • PHS: Personal Handyphone System
  • An object of the present invention is to provide an apparatus which can provide the information such as points, routes, and facilities, can actually or virtually perform navigation along a route to a certain point as a user moves or time elapses, and can provide various information for guidance in a format which can be shared by various systems or devices, and a method thereof.
  • a navigation information presenting apparatus in a first aspect of the present invention comprises an inputting unit, a state acquiring and/or generating unit, a processing unit, and a presenting unit, and presents navigation information to a user or users depending on the state.
  • the inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and various information for guidance to be output according to at least a time and/or point to be presented, where each of the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof.
  • the state acquiring unit acquires a state including at least a current time and point, and the state generating unit generates a state including at least either of a virtual current time and point.
  • the processing unit processes the instructions described in an input navigation script according to at least the current or virtual time and point obtained by the state acquisition process or the state generation process.
  • the presenting unit outputs navigation information to be output as the instructions are processed, and presents the navigation information to a user or users.
  • a navigation information presenting apparatus in a second aspect of the present invention comprises a selecting unit and an outputting unit, and presents navigation information to a user or users depending on a state.
  • the selecting unit dynamically selects navigation information to be presented according to at least time information and/or point information.
  • the outputting unit outputs selected navigation information according to at least a time and a point to be presented.
  • a driving managing device in a third aspect of the present invention comprises an inputting unit, a driving management database, a coordinating unit, and an outputting unit, and manages driving data.
  • the inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and information for guidance to be output according to at least a time and/or point to be presented, where the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof.
  • the driving management database manages the data describing at least time and/or point information, and at least a reservation state and/or a point state.
  • the coordinating unit makes a comparison and coordination between the navigation script input by the inputting unit and the data of the driving management database, and performs the process for modifying the navigation script and the process for updating the data of the driving management database depending on need, according to the result of the comparison and coordination.
  • the outputting unit outputs a resultant navigation script.
  • a time coordinating device in a fourth aspect of the present invention comprises an inputting unit, a scheduler, a rule base, and a monitoring and executing device, and proposes an action to be executed by a user depending on whether or not the user can arrive by an arrival time.
  • the inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and information for guidance to be output according to at least a time and/or point to be presented, where the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof.
  • the scheduler schedules arrival times at respective points.
  • the rule base stores the rules which describe actions to be executed depending on whether or not there is sufficient time before an arrival time.
  • the monitoring/executing device checks the arrival times at which subsequent points from the current point at the current time are reached, for at least each of a predetermined time, point, and distance.
  • a navigation plan creating apparatus in a fifth aspect of the present invention comprises an associating unit, a setting unit, and a creating unit, and creates a navigation plan obtained by combining navigation information.
  • the associating unit associates the navigation information with one of areas and points of map information.
  • the setting unit sets a route specified on the map information.
  • the creating unit creates a navigation plan by extracting the navigation information associated with the set route.
  • a navigation information providing apparatus in a sixth aspect of the present invention comprises a managing unit, a retrieving unit, and a providing unit, and provides information to a user.
  • the managing unit manages the information with a presentation condition relating to a time.
  • the retrieving unit checks the information with the presentation condition for each time step, and retrieves the information which satisfies a time condition.
  • the providing unit provides a user with the information which satisfies the time condition.
  • a navigation information providing apparatus in a seventh aspect of the present invention comprises a managing unit, an obtaining unit, a retrieving unit, and a providing unit, and provides a user with information.
  • the managing unit manages the information with a presentation condition relating to a place.
  • the obtaining unit obtains the position information of a user.
  • the retrieving unit checks the information with the presentation condition according to the obtained position of the user, and retrieves information which satisfies a place condition.
  • the providing unit provides the user with the information which satisfies the place condition.
  • a navigation information providing apparatus in an eighth aspect of the present invention comprises a managing unit, an obtaining unit, a retrieving unit, and a providing unit, and provides a user with information.
  • the managing unit manages the information with a presentation condition relating to a place.
  • the obtaining unit obtains the position information of a user.
  • the retrieving unit checks the information with the presentation condition, and retrieves a user who satisfies a place condition.
  • the providing unit provides the retrieved user with the information with the corresponding presentation condition.
  • FIG. 1 is a block diagram showing the configuration of an apparatus according to the present invention
  • FIG. 2 explains the process performed by a script editing unit
  • FIG. 3 shows a part of structured navigation data into which a navigation script is converted, in the form of a table
  • FIG. 4 shows, in the form of a table, a part of structured navigation data into which a naviscript is converted
  • FIG. 5 is a flowchart showing the process performed by an operation inputting unit
  • FIG. 6 is a flowchart showing the process performed by a script converting unit
  • FIG. 7 is a flowchart showing the preparing process performed by an instruction processing unit
  • FIG. 8 is a flowchart showing the execution process performed by the instruction processing unit
  • FIG. 9 is a flowchart showing the state acquiring process performed by a state acquiring unit
  • FIG. 10 is a flowchart showing the information acquiring process performed by the state acquiring unit
  • FIG. 11 is a flowchart showing the state preparing process performed by a state generating unit
  • FIG. 12 is a flowchart showing the state generating process performed by the state generating unit
  • FIG. 13 is a flowchart showing the navigation outputting process performed by a navigation outputting unit
  • FIG. 14A shows a target route of a script semi-automatic generation process
  • FIG. 14B shows target time series data of the script semi-automatic generation process
  • FIG. 15 exemplifies the system configuration in the case where the present invention is applied to a portable personal computer
  • FIG. 16 exemplifies a menu screen for retrieving a naviscript
  • FIG. 17 exemplifies a screen resultant from the naviscript retrieval
  • FIG. 18 exemplifies a screen on which navigation and operations are performed based on a naviscript
  • FIG. 19 exemplifies the system configuration in the case where the present invention is applied to a car navigation system
  • FIG. 20 exemplifies the system configuration in the case where the present invention is applied to a PHS
  • FIG. 21 exemplifies the system configuration in the case where the present invention is applied to a driving managing system
  • FIG. 22 exemplifies a naviscript editor screen displayed by a terminal
  • FIG. 23 exemplifies a naviscript browser screen displayed by the terminal
  • FIG. 24 is a flowchart showing the process performed by the terminal
  • FIG. 25 is a flowchart showing the process performed by a driving managing center
  • FIG. 26 is a flowchart showing the comparison/coordination process performed by the driving managing center
  • FIG. 27 exemplifies the system configuration in the case where the present invention is applied to a time coordinating system during a move
  • FIG. 28 exemplifies a monitor display screen
  • FIG. 29 is a flowchart showing the process performed by a scheduler
  • FIG. 30 is a flowchart showing the process performed by a monitor
  • FIG. 31 exemplifies the configuration of a navigation plan creating and information for guidance managing system
  • FIG. 32 is a flowchart showing the process for creating a navigation plan
  • FIG. 33A shows the process for attaching navigation information to map data
  • FIG. 33B shows the process for attaching navigation information to time data
  • FIG. 34 exemplifies a navigation information setting screen
  • FIG. 35 exemplifies the process for attaching navigation information
  • FIG. 36A shows a first example of navigation sheets
  • FIG. 36B shows a second example of a navigation sheet
  • FIG. 37 exemplifies navigation information display
  • FIG. 38A shows a first example of a route
  • FIG. 38B shows a second example of the route
  • FIG. 38C shows a first expanded route
  • FIG. 38D shows a second expanded route
  • FIG. 38E shows a point database and a time database
  • FIG. 39 exemplifies map data for route navigation
  • FIG. 40 exemplifies the configuration of a system for processing information with a time/point condition
  • FIG. 41A shows a first example of an information presentation screen
  • FIG. 41B shows a second example of the information presentation screen
  • FIG. 41C shows a third example of the information presentation screen
  • FIG. 42 exemplifies the system configuration in the case where a server side processes information with a time condition
  • FIG. 43 exemplifies the system configuration in the case where a terminal side processes information with a time condition
  • FIG. 44 is a flowchart showing the process for manipulating information with a time condition
  • FIG. 45 exemplifies the system configuration in the case where a server side processes information with a place condition
  • FIG. 46 is a flowchart showing the process for manipulating the information with the place condition
  • FIG. 47 exemplifies the system configuration in the case where a terminal side processes information with a place condition
  • FIG. 48 is a flowchart showing the process for manipulating information with a time/point condition
  • FIG. 49 is a flowchart showing the process for manipulating information with a time/point condition performed on the terminal side
  • FIG. 50 exemplifies the system configuration in the case where a terminal having a scheduling capability processes information with a condition
  • FIG. 51 is a flowchart showing the process for manipulating information with a time/point condition performed on the terminal side having the scheduling capability.
  • voice or image warning is issued by an instruction which is described in the navigation script beforehand.
  • This navigation script can be used also for a car navigation system having the capabilities of the present invention. Therefore, the navigation according to the navigation script created by the friend can be received, also when the user visits the friend's home by car.
  • a navigation script for navigating Shibuya Station for a couple of hours is downloaded from the center providing navigation information via a network.
  • Instructions described in the navigation script are executed by a portable information device, so that a navigation service according to a time and a place can be received. Also restaurant information is automatically displayed around lunch time.
  • the navigation service can be also received by a cellular phone, etc.
  • the instructions included in the navigation script are executed on a center side, and a center device transmits the navigation information to the cellular phone as voice or text.
  • navigation scripts of recommended sightseeing courses are created and registered on an electronic medium, such as a CD-ROM (Compact Disk-Read Only Memory) attached to a travel magazine as a supplement, a bar code, etc.
  • a subscriber retrieves his or her desired sightseeing course from the electronic medium with a PC, etc., and executes the instructions included in the retrieved navigation script in a simulation mode.
  • the navigation information is dynamically displayed as if the subscriber actually walked along the sightseeing course.
  • by executing the navigation script in a navigation mode on the actual course, the information for guidance corresponding to the point where the subscriber actually is can be viewed.
  • a navigation information presenting apparatus comprises: an inputting unit for inputting a navigation script describing an instruction sequence which can describe at least time and/or point information, and information for guidance to be output according to a time and/or point to be presented, based on a predetermined specification; a unit for acquiring the state of a current time and point, or for generating the state of a virtual current time and point; a unit for processing instructions described in the input navigation script according to the current time and point obtained by the state acquisition or generation process; and a unit for outputting the navigation information to be output while the instructions are processed, and for presenting the navigation information to a user or users.
  • the navigation script is described, for example, in a markup language which identifies with tags the time information, the point information, the information for guidance, and the other constituent elements of instructions.
  • the navigation script can describe a directive for directing a plurality of instructions to be processed sequentially or in parallel.
  • the unit for processing instructions processes the plurality of instructions sequentially or in parallel according to the above described directive for sequential/parallel instruction processing.
  • the unit for inputting a navigation script inputs a navigation script specified by a user or users by communicating with an external device which provides the navigation script via a network, and/or by reading the navigation script from a computer-readable electronic medium, and/or from an input device operated by a user or users.
  • the navigation information presenting apparatus further comprises a unit for parsing an input navigation script and for converting the script into hierarchical and grouped navigation data.
  • the unit for processing instructions processes the instructions represented as structured navigation data.
  • the unit for outputting navigation information presents to a user or users a part or the whole of a navigation script, such as a current point, a departure point, en-route spots, a destination, and a route one after another or according to each instruction. Additionally, the navigation information is presented to the user or users as texts, maps, voice, images, videos, lights, smell, force, movements, etc. for a specified time, point, distance, input operation, and/or an external event.
  • a navigation mode or a simulation mode can be selected. Instructions are processed according to the state of an actual current time and/or point in the navigation mode, while the instructions are processed according to the state of a virtual current time and/or point in the simulation mode. As a result, navigation information is presented to a user or users.
  • a program for realizing the above described units by means of a computer can be stored in a computer-readable portable medium memory or a suitable storage medium such as a semiconductor memory, a hard disk, etc.
  • a navigation script can be stored in a computer-readable portable medium memory such as a magnetic disk, an optical disk, an IC card, etc., or a suitable storage medium such as a semiconductor memory, a hard disk, etc.
  • the navigation script may be converted into a bar code, which can be printed on paper or other printed matter.
  • the navigation script can be created and edited by a normal text editor or a GUI (Graphical User Interface) editor. Or, the script can be semi-automatically generated based on a history of time and point information, which is obtained while moving on a route to be actually navigated.
  • GUI: Graphical User Interface
  • the above described navigation script has the feature that an instruction sequence relating to times, points, and navigation information is described in a markup language based on a predetermined specification, and is easy for human beings to read and write. Additionally, the navigation script can be created, provided, and used in a format shared by various devices, is easy to distribute via a network, an electronic medium, etc., and can easily be duplicated.
  • the navigation script can describe various navigation information related to time and point. Navigation information related to a point and a route, such as “This facility is famous for ○○,” can be described, and navigation information about time, such as “Inform the user ten minutes prior to arrival,” can also be described.
  • FIG. 1 is a block diagram exemplifying the configuration of a navigation information presenting apparatus according to the present invention.
  • an instruction sequence composed of data (such as text data, image data, voice data, etc.) of time, point, and information for guidance, which are stored in various formats, is described in a markup language description format.
  • An instruction is a unit of a script composed of navigation information including times (such as a departure time, en-route times, an arrival time, a start time, an end time, etc.), and points (such as a departure point, en-route spots, a destination, an intersection, a transfer point, a facility location, etc.), and one shot or a portion of various media data (a map, text, voice, music, an image, a video, etc.).
  • the instruction is, for example, a directive for outputting voice data (aaa.wav) and image data (xxx.jpg), which explain a point A, at the point A on a certain route.
  • Such an instruction sequence which is described in a markup language such as an XML (Extensible Markup Language) (“Extensible Markup Language (XML) 1.0,” World Wide Web Consortium (W3C) Recommendation, REC-xml-19980210, Feb. 10, 1998. http://www.w3.org/TR/1998/REC-xml-19980210) format, is referred to as a navigation script, according to the present invention.
  • the navigation script is abbreviated to a naviscript in the remaining portion of this specification.
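  • For concreteness, the fragment below sketches what a minimal naviscript containing the kind of instruction described above (voice data aaa.wav and image data xxx.jpg presented at a point A) might look like. This is only an illustrative sketch: the <navi>, <inst>, <point>, and <name> elements follow the tag vocabulary quoted later in this specification, while the <voice> and <image> elements and the parsing code are assumptions, not part of the patent.

```python
import xml.etree.ElementTree as ET

# Hypothetical naviscript fragment (element names partly assumed, see above):
# one instruction that plays aaa.wav and shows xxx.jpg when point A is reached.
NAVISCRIPT = """
<navi>
  <inst id="inst-point-a">
    <point><name>Point A</name></point>
    <voice src="aaa.wav"/>
    <image src="xxx.jpg"/>
  </inst>
</navi>
"""

root = ET.fromstring(NAVISCRIPT)
for inst in root.findall("inst"):
    point_name = inst.findtext("point/name")
    media = [(child.tag, child.get("src")) for child in inst if child.tag != "point"]
    print(inst.get("id"), point_name, media)
```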
  • the naviscript is stored and managed by a center 40 .
  • the naviscript is stored in various media such as a magnetic disk, a CD-ROM, etc., and is read from a user terminal 10 .
  • An operation inputting unit 11 of the user terminal 10 selects a naviscript from among the naviscripts stored in the center 40 and/or various media 32 via a network accessing unit 12 and/or a medium accessing unit 13 in response to a retrieval request issued by a user, and passes the selected naviscript and/or a naviscript directly input by a user to a script converting unit 14 .
  • the script converting unit 14 parses the naviscript, and converts it into structured navigation data.
  • an instruction processing unit 15 obtains the current state (current time, point, etc.) of the user from the state acquiring unit 16 , and complements the route information of the structured navigation data. Navigation information is then output from a navigation outputting unit 18 based on the structured navigation data according to the obtained state.
  • the instruction processing unit 15 obtains a virtual current time and point from a state generating unit 17 , and complements the route information of the structured navigation data. Navigation information is then output from the navigation outputting unit 18 .
  • consider, for example, a naviscript for navigating a ○○ Tour on a route from Tokyo Station to the Rainbow Bridge via Kyobashi IC (InterChange); the naviscript describes the following instructions.
  • the operation inputting unit 11 reads this naviscript from the center 40 via a network 31 , etc. and starts to execute the naviscript, according to a user instruction.
  • the script converting unit 14 then generates structured navigation data by converting the naviscript.
  • the instruction processing unit 15 first extracts the descriptions of points and the route, which are included in the instructions, based on the structured navigation data, and displays the summary of the route by referring a database 20 storing map information, etc. Then, the instruction processing unit 15 obtains the current position or time from the state acquiring unit 16 of a GPS, etc., and processes the instructions based on the obtained point or time.
  • the navigation outputting unit 18 outputs the voice data “Tokyo Station” when the user reaches Tokyo Station, outputs the voice message “Welcome to ○○ Tour” 2 minutes later, and displays the image data of the summary of the tour. Furthermore, the navigation outputting unit 18 outputs the voice data “Kyobashi IC” at Kyobashi IC, the voice data “Rainbow Bridge Soon Ahead” 3 km before the Rainbow Bridge, and the voice data “Rainbow Bridge” upon arrival at the Rainbow Bridge. Accordingly, the user can obtain helpful navigation information at suitable spots and at suitable timing while moving on the route of the ○○ Tour.
  • in such a naviscript, an instruction sequence relating to times, places, and information for guidance is described in a markup language description format. A generated naviscript is as easy to read and write as an existing markup language document, thereby facilitating retrieval and processing. Accordingly, the meaning of the data of a naviscript, and whether or not its instruction sequence is described in the order in which it is to be executed, are made clear to a naviscript generator.
  • since a naviscript obtained from the center 40 , etc. is converted into structured navigation data corresponding to the local terminal itself, one naviscript can be shared by various devices and systems.
  • navigation information is presented according to an instruction sequence (a time sequence and/or a point sequence), so that navigation information more suitable for the current state is obtained at suitable timing.
  • in the navigation mode, navigation information can be obtained on an actual route.
  • in the simulation mode, the navigation of a certain route can be virtually experienced.
  • the naviscript can be easily created and edited by using an existing text editor, and a created naviscript can be registered to a center, etc., so that everybody can obtain navigation information everywhere by using the naviscript via a network, etc.
  • FIG. 2 explains the process performed by a script editing unit 41 . Since the naviscript is described in a markup language, it can be edited by using a normal text editor. As shown on the naviscript editing screen 42 illustrated in FIG. 2, a naviscript can also be generated and edited with a GUI: a route, etc. is edited and input on a map with the use of the map information obtained from a map information database 44 , and the information on the naviscript editing screen 42 is converted into a naviscript described in a markup language with the use of a translator 43 , which converts graphic information such as a map, etc. into text information.
  • the translator 43 has not only a capability for converting a map image into a naviscript, but also a capability for converting a naviscript stored in a buffer/file 45 into information to be displayed on a map.
  • a naviscript editing tool can be easily implemented, like a home page creation tool for the Internet.
  • the editing tool can be utilized not only by the center 40 , but also by a personal computer that a general user possesses.
  • the naviscript language adopted in this preferred embodiment is a markup language for describing a naviscript, which is newly defined as a subset of XML (Extensible Markup Language) (“Extensible Markup Language (XML) 1.0,” World Wide Web Consortium (W3C) Recommendation, REC-xml-19980210, Feb. 10, 1998, http://www.w3.org/TR/1998/REC-xml-19980210) laid down by the W3C (World Wide Web Consortium).
  • XML: Extensible Markup Language
  • W3C: World Wide Web Consortium
  • a tag which does not start with “</” is called a start tag, while a tag which starts with “</” is called an end tag.
  • such a pair is hereinafter referred to as a tag set.
  • “id” included in <inst id=“inst-01”> is defined to be an attribute of the tag,
  • and “inst-01” is defined to be the value of the attribute.
  • Instructions enclosed by <seq> and </seq> perform navigation sequentially, while instructions enclosed by <par> and </par> perform navigation in parallel. Similarly, instructions enclosed by <time-optimal> and </time-optimal> perform navigation in an optimal order of required time, instructions enclosed by <distance-optimal> and </distance-optimal> perform navigation in an optimal order of required distance, and instructions enclosed by <cost-optimal> and </cost-optimal> perform navigation in an optimal order of required cost.
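  • The short sketch below illustrates, under assumptions, how a processor might honor the <seq> and <par> grouping directives: children of <seq> are dispatched in document order, children of <par> concurrently. The instruction payloads are reduced to their id attributes, and the dispatch strategy is an assumed example rather than the patent's implementation.

```python
import xml.etree.ElementTree as ET
from concurrent.futures import ThreadPoolExecutor

SCRIPT = """
<navi>
  <seq>
    <inst id="inst-01"/>
    <par>
      <inst id="inst-02"/>
      <inst id="inst-03"/>
    </par>
    <inst id="inst-04"/>
  </seq>
</navi>
"""

def run_instruction(inst):
    # Placeholder for presenting the navigation information of one instruction.
    print("executing", inst.get("id"))

def run(node):
    if node.tag == "inst":
        run_instruction(node)
    elif node.tag == "par":
        # <par>: child instructions are processed in parallel.
        with ThreadPoolExecutor() as pool:
            list(pool.map(run, list(node)))
    else:
        # <navi>, <seq>, and other grouping tags: children in document order.
        for child in node:
            run(child)

run(ET.fromstring(SCRIPT))
```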
  • [0142] shows a relative time display: “5 seconds after a preceding instruction”.
  • [0148] shows a direct and absolute point display with a coordinate of longitude and latitude.
  • [0152] shows an indirect and absolute point display with a name, an address, and a telephone number.
  • [0154] shows a relative point display: “1 km beyond a preceding point”.
  • [0156] shows a relative point display: “1 km before a succeeding point”.
  • [0160] shows an indirect point range display with a name, an address, and a zip code.
  • [0170] shows route information specified with a function.
  • a condition of whether or not to execute each instruction can be described depending on whether or not the information about a navigation user, a move method, an environment, etc. are equal to certain values, or belong to a certain range (set).
  • the information about a navigation user which are used for the condition of whether or not to execute an instruction, include the following items: sex, age, date of birth, blood type, single/married/divorced/bereaved, the number of children, family members, address, legal domicile, place of employment, occupation (business category, occupation type, title), height, weight, figure, physical ability, disease, handicap (sense of sight, sense of color, sense of hearing, taste, language, body), character, hobby, liking (liquor, tobacco, having a sweet tooth/drinking, Japanese-style food/Western food, fish/meat, etc.), driving years, accident history, violation history, temperature, blood pressure, pulsation, heart beat rate, brain waves, eyeball movement, driving time, driver, fellow passenger, etc.
  • the information about a move method include: a type (walking, bicycle, two-wheeled vehicle, car, bus, train, ship, airplane, etc.), position, speed, acceleration, direction, angular velocity, angular acceleration, altitude, residual quantity of gas, light ON/OFF, windshield wiper ON/OFF, room lamp ON/OFF, air conditioner ON/OFF, radio/TV ON/OFF, car navigation system ON/OFF, air flow, sound volume, checking/mobile inspection time, car type, displacement, automobile maker, right/left handle, etc.
  • the information about an environment include weather (fine/cloudy/rain/snow, rainy season/typhoon), temperature, humidity, atmospheric pressure, probability of rainfall, UV index, photochemical smog index, noise index, traffic jam state, regulation information, accident information, etc.
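  • The specification lists these attributes without fixing a concrete syntax for attaching a condition to an instruction. Purely as an assumed illustration, a condition could be represented as a mapping from an attribute name to an exact value or to a range (set), and evaluated against the current user/move-method/environment state as sketched below.

```python
def condition_satisfied(condition, state):
    """Return True if every attribute in the condition matches the state."""
    for attr, expected in condition.items():
        actual = state.get(attr)
        if isinstance(expected, tuple):            # (low, high) range membership
            low, high = expected
            if actual is None or not (low <= actual <= high):
                return False
        elif actual != expected:                   # exact value match
            return False
    return True

state = {"move_method": "car", "age": 34, "weather": "rain"}
condition = {"move_method": "car", "age": (20, 60)}
print(condition_satisfied(condition, state))       # True: instruction is executed
```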
  • text data, voice data, sound data, image data, video data, etc. can be specified as navigation information as follows.
  • … <point> <name> Tokyo Sta. Yaesu Central Exit </name> </point> …
  • the contents of the respective instructions are defined after </navi>.
  • the initial instruction “inst-info-opening” instructs a navigation message “Welcome to Rainbow Town Tour” to be vocally output after 5 seconds elapse from a departure time.
  • a naviscript like the above described one is converted into structured navigation data by the script converting unit 14 .
  • An example of structured navigation data into which the above described naviscript is converted is provided below.
  • FIGS. 3 and 4 show a portion of the above described structured navigation data in the form of a table.
  • the voice navigation “Welcome to Rainbow Town Tour” is first performed. Then, the route navigation from Tokyo Sta. Yaesu Central exit to Daiba IC via Kyobashi IC and Edobashi JC is performed after the train navigation from Kaihinmakuhari Sta. to Tokyo Sta.
  • the Adtext “Rainbow Bridge in 10 minutes” is displayed 10 minutes before a scheduled time at which Edobashi JC is passed through. Additionally, if the arrival time at Daiba IC is between 11:30 and 13:30, the information about restaurants is presented. If the arrival time at Daiba IC is before 11:30 or after 13:30, the information about cafes is presented.
  • in the example of FIGS. 3 and 4, the arrival time at Daiba IC is between 11:30 and 13:30.
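  • The structured navigation data themselves are shown only in FIGS. 3 and 4; the sketch below approximates, with assumed field names, how the instructions of the Rainbow Town Tour example just described could be held as a list of records after conversion (the “original” flag attached during the preparation process described later is included).

```python
# Assumed field names; the contents follow the Rainbow Town Tour example above.
structured_navigation_data = [
    {"id": "inst-info-opening",
     "time": "departure + 5 s",
     "navi": {"voice": "Welcome to Rainbow Town Tour"}, "flag": "original"},
    {"id": "inst-train",
     "route": ["Kaihinmakuhari Sta.", "Tokyo Sta."],
     "navi": {"text": "train navigation"}, "flag": "original"},
    {"id": "inst-drive",
     "route": ["Tokyo Sta. Yaesu Central Exit", "Kyobashi IC",
               "Edobashi JC", "Daiba IC"],
     "navi": {"text": "route navigation"}, "flag": "original"},
    {"id": "inst-ad",
     "time": "Edobashi JC - 10 min",
     "navi": {"text": "Rainbow Bridge in 10 minutes"}, "flag": "original"},
    {"id": "inst-lunch", "point": "Daiba IC",
     "condition": {"arrival_time": ("11:30", "13:30")},
     "navi": {"text": "restaurant information"}, "flag": "original"},
    {"id": "inst-cafe", "point": "Daiba IC",
     "condition": {"arrival_time_outside": ("11:30", "13:30")},
     "navi": {"text": "cafe information"}, "flag": "original"},
]
```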
  • the operation inputting unit 11 obtains a naviscript stored in the center 40 or the medium 32 , or a naviscript input by a user.
  • the flow of the process performed by the operation inputting unit 11 is shown in FIG. 5.
  • the operation inputting unit 11 accesses the center 40 via the network 31 with the use of the network accessing unit 12 , and/or accesses the medium 32 storing naviscripts with the use of the medium accessing unit 13 .
  • a desired naviscript is retrieved and selected according to a user instruction, or a naviscript is directly input by a user, so that the operation inputting unit 11 receives the naviscript (step S 11 ).
  • the operation inputting unit 11 passes the received naviscript to the script converting unit 14 (step S 12 ). Although the naviscript itself is received from the medium 32 at this time, an external image file, etc. specified by a URL within the naviscript may sometimes be received via the network 31 .
  • the script converting unit 14 converts a naviscript described in a markup language into structured navigation data.
  • the flow of the process performed by the script converting unit 14 is shown in FIG. 6.
  • the script converting unit 14 receives a naviscript from the operation inputting unit 11 (step S 21 ), converts the received naviscript into structured navigation data (step S 22 ), and passes the data to the instruction processing unit 15 (step S 23 ).
  • the script converting unit 14 can convert a naviscript into not only the structured data referenced by the instruction processing unit 15 , but also various types of structured data used by the local system or other devices, etc. Accordingly, it is possible, for example, to have a scheduler display a time instruction after it is passed to the scheduler unchanged or after it is converted, or to display on a map the information obtained by converting a place instruction into a map description script.
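  • A minimal sketch of the conversion performed by the script converting unit 14: parse the markup with a standard XML parser and flatten each <inst> element into a record. Tag and field names other than those quoted elsewhere in this specification are assumptions.

```python
import xml.etree.ElementTree as ET

def to_structured_data(naviscript_text):
    """Flatten each <inst> element of a naviscript into one record."""
    root = ET.fromstring(naviscript_text)
    records = []
    for inst in root.iter("inst"):
        records.append({
            "id": inst.get("id"),
            "time": inst.findtext("time"),         # None if no time is specified
            "point": inst.findtext("point/name"),  # None if no point is specified
            # keep the remaining children as the raw guidance payload
            "navi": [(c.tag, dict(c.attrib), (c.text or "").strip())
                     for c in inst if c.tag not in ("time", "point")],
        })
    return records

example = ('<navi><seq><inst id="inst-01">'
           '<point><name>Kyobashi IC</name></point>'
           '<voice src="kyobashi.wav"/></inst></seq></navi>')
print(to_structured_data(example))
```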
  • the instruction processing unit 15 processes the instructions included in structured navigation data according to the current state of a user or a virtually set state for simulation after complementing the information in an unspecified portion of the route information of the structured navigation data received from the script converting unit 14 .
  • the instruction processing unit 15 performs the process shown in FIG. 7 as a preparation process of the instruction processing, and further performs the process shown in FIG. 8 as an execution process.
  • the instruction processing unit 15 determines whether the execution mode set by a user is the navigation mode or the simulation mode (step S 32 ). If the instruction processing unit 15 determines that the execution mode is the navigation mode, it makes the state acquiring unit 16 acquire a state (an actual current time/point), and obtains the state (step S 33 ). Then, the instruction processing unit 15 adds the actual current point to the beginning of the structured navigation data (step S 34 ). The flow then goes to step S 35 .
  • if the instruction processing unit 15 determines that the execution mode is the simulation mode, it issues a request to prepare a state to the state generating unit 17 , and a further request to generate a state upon completion of the initial request.
  • the instruction processing unit 15 then obtains the state (virtual current time and point) (step S 42 ), and adds the virtual current position to the beginning of the structured navigation data (step S 43 ).
  • the instruction processing unit 15 attaches a flag indicating “original” to all of the instructions (step S 35 ).
  • This flag is intended to make a distinction between an instruction originally included in a naviscript and an instruction newly added by a complementing process to be described later.
  • the instruction processing unit 15 complements the information about a place within the structured data (step S 36 ).
  • an attribute or attributes not described in the naviscript among various attributes such as latitude, longitude, altitude, a name, an address, a phone number, a zip code, etc. are retrieved from the database unit 20 with a described attribute as a key. If only an area is specified, the attribute of a representative spot in the area is retrieved.
  • if Shinjuku Ward is specified, for example, spots representative of Shinjuku Ward, such as Shinjuku Ward Office, Shinjuku Station, etc., are retrieved.
  • if Mt. Fuji is specified, spots representative of Mt. Fuji, such as the top of Mt. Fuji, an entry point of a Mt. Fuji route, etc., are retrieved from the database unit 20 .
  • if a plurality of retrieval results are obtained, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using an evaluation index.
  • the instruction processing unit 15 describes a retrieved/selected attribute in a corresponding portion of the structured navigation data.
  • the instruction processing unit 15 complements the information about a route within the structured navigation data (a portion where the route is not specified, etc.) (step S 37 ).
  • if a route is not specified in an item of a route, or if a category (such as a normal road, a toll road, a highway, time precedence, distance precedence, straight drive precedence, wider road precedence, etc.) is specified, the instruction processing unit 15 retrieves a route. If a plurality of retrieval results are obtained, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index.
  • the instruction processing unit 15 additionally describes the retrieved/selected route to the corresponding portion of the structured navigation data, and attaches a flag indicating “addition” to the instruction of the route. If the route is entirely specified, the instruction processing unit 15 determines whether this route is either available or unavailable. If the instruction processing unit 15 determines that the route is unavailable, it retrieves another route. If a plurality of retrieval results exist also in this case, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. The instruction processing unit 15 then modifies the corresponding portion of the structured navigation data by describing the retrieved/selected route therein, and attaches a flag indicating “addition” to the instruction of that route.
  • the instruction processing unit 15 converts all of the relatively specified places into absolutely specified places (step S 38 ), estimates expected arrival times at the respective places (step S 39 ), and rearranges all of the instructions in time order (step S 40 ).
  • the instruction processing unit 15 sets an instruction cursor, which points to the instruction to be executed next, to the initial instruction (step S 41 ). The flow then proceeds to the execution process shown in FIG. 8.
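  • Before turning to the execution process, the preparation steps S31 to S41 just described can be summarized in code roughly as follows. This is a schematic of the flow of FIG. 7 under assumed data structures; the database lookups and the route search are stubbed out.

```python
# Schematic of the preparation process (FIG. 7). Helper steps are stubs; in a
# real system they would consult the map database 20 and a route search engine.
def complement_places(data):            # S36: fill in missing place attributes
    pass

def complement_routes(data):            # S37: fill in or repair route portions
    pass

def make_points_absolute(data):         # S38: relative -> absolute points
    pass

def estimate_arrival_times(data, now):  # S39: crude placeholder estimate
    for i, inst in enumerate(data):
        inst.setdefault("time", now + 60 * i)

def prepare(data, mode, acquire_state, prepare_and_generate_state):
    """Return the initial instruction-cursor index after preparing the data."""
    if mode == "navigation":
        now, here = acquire_state()                   # S33: actual time and point
    else:
        now, here = prepare_and_generate_state()      # S42: virtual time and point
    data.insert(0, {"point": here, "time": now})      # S34/S43: prepend current point
    for inst in data:
        inst.setdefault("flag", "original")           # S35: mark original instructions
    complement_places(data)
    complement_routes(data)
    make_points_absolute(data)
    estimate_arrival_times(data, now)
    data.sort(key=lambda inst: inst["time"])          # S40: rearrange in time order
    return 0                                          # S41: cursor on the first instruction
```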
  • the instruction processing unit 15 first determines whether the execution mode is the navigation mode or the simulation mode (step S 51 ). If the instruction processing unit 15 determines that the execution mode is the navigation mode, it makes the state acquiring unit 16 acquire a state (an actual current time and point), and obtains the state (step S 52 ). If the instruction processing unit 15 determines that the execution mode is the simulation mode, it requests the state generating unit 17 to generate a state (a virtual current time and point), and obtains the state (step S 59 ).
  • the instruction processing unit 15 determines whether or not the current position is on the specified route in the instruction indicated by the instruction cursor. If the processing unit 15 determines that the current position is not on the route, it complements the information about the route of the structured navigation data (step S 53 ). When the instruction processing unit 15 complements the information about the route, it retrieves a route from the current position to a near point on the specified route. If a plurality of retrieval results exist, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. The instruction processing unit 15 then additionally describes the retrieved or selected route to the corresponding portion of the structured navigation data, and attaches a flag indicating “addition” to the instruction of that route.
  • if the instruction processing unit 15 determines that the execution mode is the navigation mode (step S 54 ), it obtains the information corresponding to the settings separately made by the user (step S 55 ). Assuming that the user has made a presetting such that traffic information is used, the instruction processing unit 15 makes the state acquiring unit 16 acquire the traffic information (such as the information about a traffic jam, a traffic regulation, an accident, etc.), and obtains the traffic information (step S 55 ).
  • the instruction processing unit 15 then complements the information about a route based on the information obtained in step S 55 , etc. for the route to which the flag indicating “addition” is attached (step S 56 ).
  • if the flag of the route specified by the instruction indicated by the instruction cursor is “addition”, if a traffic jam, regulation, or accident occurs at the nearest indispensable point or on the nearest indispensable route included in the instructions subsequent to that indicated by the instruction cursor, and if the automatic route change setting separately made by the user is ON, the instruction processing unit 15 retrieves a route from the current position to the indispensable point or route.
  • the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index.
  • the instruction processing unit 15 modifies the corresponding portion of the structured navigation data by describing the retrieved or selected route therein, and attaches a flag indicating “addition” to the instruction of the route.
  • if the time and/or point described in the instruction indicated by the instruction cursor matches the actual current time and point (in the navigation mode) or the virtual current time and point (in the simulation mode), or falls within an area including an error caused by sampling, etc., the instruction processing unit 15 passes the navigation information to the navigation outputting unit 18 (step S 57 ), and updates the instruction cursor to point to the next instruction (step S 58 ). The instruction processing unit 15 repeats the above described process until there is no instruction left (step S 59 ).
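  • Schematically, the execution process of FIG. 8 is a polling loop: obtain the actual or virtual current state, and emit the navigation information of the instruction under the cursor once its time and/or point matches within a sampling tolerance. The sketch below is an assumption-level illustration that omits the route repair and traffic information handling of steps S53 to S56.

```python
import time

def execute(data, cursor, mode, acquire_state, generate_state, output,
            time_tolerance=30.0):
    """Schematic of the execution process (FIG. 8); route repair and traffic
    information handling are omitted."""
    while cursor < len(data):                           # repeat until no instruction is left
        if mode == "navigation":
            now, here = acquire_state()                 # S52: actual current time and point
        else:
            now, here = generate_state()                # simulation: virtual time and point
        inst = data[cursor]
        time_ok = "time" not in inst or now >= inst["time"] - time_tolerance
        point_ok = "point" not in inst or inst["point"] == here
        if time_ok and point_ok:
            output(inst)                                # S57: pass navigation info to output
            cursor += 1                                 # S58: advance the instruction cursor
        elif mode == "navigation":
            time.sleep(1.0)                             # wait for the next state sample
    return cursor
```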
  • the state acquiring unit 16 acquires a state such as a current time and point, or various information such as traffic information, etc.
  • the state acquisition process and the information acquisition process, which are performed by the state acquiring unit 16 are respectively shown in FIGS. 9 and 10.
  • upon receipt of a request to acquire a state issued from the instruction processing unit 15 , the state acquiring unit 16 acquires the actual current time and point, and passes the acquired time and point to the instruction processing unit 15 (step S 61 ).
  • upon receipt of a request to acquire information from the instruction processing unit 15 , the state acquiring unit 16 acquires information such as traffic information, etc. as needed by using a suitable communication means, and passes the acquired information to the instruction processing unit 15 (step S 62 ).
  • the state generating unit 17 prepares and generates the values required for the simulation mode, such as a virtual current time and point, etc.
  • the state preparation process and the state generation process, which are performed by the state generating unit 17 , are respectively shown in FIGS. 11 and 12.
  • upon receipt of the request to prepare a state from the instruction processing unit 15 , as shown in FIG. 11, a virtual departure time is set to either the actual current time or a time that the user separately sets, whichever is selected by the user or the system (step S 71 ).
  • the state generating unit 17 performs the process for setting a virtual departure point to a point that the user or the system selects from among an actual current point, a point that the user separately sets (such as a user home), and the initial point that first appears in the structured navigation data (step S 72 ).
  • the state generating unit 17 sets a virtual time elapse speed to either a default virtual time elapse speed set by the system or a virtual time elapse speed that the user separately sets, whichever is selected by the user or the system (step S 73 ).
  • a virtual place move speed for each place move means, such as walking, a bicycle, a car, etc., is set to either a default virtual place move speed that the system sets or a virtual place move speed that the user separately sets, whichever is selected by the user or the system (step S 74 ).
  • a simulation sampling time period is set to either a default simulation sampling time period that the system sets or a simulation sampling time period that the user separately sets, whichever is selected by the user or the system (step S 75 ).
  • the state generating unit 17 sets the virtual departure time to a virtual current time (step S 76 ), and the virtual departure point to a virtual current point (step S 77 ).
•   a virtual current time and point are passed to the instruction processing unit 15 (step S 81 ), and the simulation sampling time period is added to the virtual current time, so that the virtual current time is updated (step S 82 ). Furthermore, the virtual current point is updated (step S 83 ). Namely, the virtual current point is advanced along the route specified by the instruction currently being executed, by the distance calculated by multiplying the virtual place move speed of the place moving means that the instruction specifies by the simulation sampling time period. Note, however, that the virtual current point is updated to the end point of the route if the advanced point is beyond the range of the route.
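•   A minimal sketch of this state generation process (steps S81 to S83) is given below; representing the route as a single travelled distance and the names used here are simplifying assumptions.

```python
# Sketch of the state generation process (steps S81-S83).
from dataclasses import dataclass

@dataclass
class SimulationState:
    virtual_time: float        # virtual current time (s)
    position_on_route: float   # distance travelled along the current route (m)
    sampling_period: float     # simulation sampling time period (s)

def generate_state(state, move_speed, route_length):
    """Return the current virtual time/point, then advance the simulation."""
    current = (state.virtual_time, state.position_on_route)    # step S81
    state.virtual_time += state.sampling_period                # step S82
    advanced = state.position_on_route + move_speed * state.sampling_period
    # step S83: clamp to the end point if the move goes beyond the route
    state.position_on_route = min(advanced, route_length)
    return current
```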
  • the navigation outputting unit 18 outputs navigation (information) based on a naviscript.
  • FIG. 13 shows the flow of the process performed by the navigation outputting unit 18 .
•   Upon receipt of the request to output navigation from the instruction processing unit 15 , the navigation outputting unit 18 outputs the corresponding navigation (step S 91 ).
  • the above described format of structured data corresponds to one of such specifications. Accordingly, if the instruction sequence is given in the format of structured data initially, it may be input to the instruction processing unit 15 unchanged.
  • a naviscript can be edited by a normal text editor or a naviscript editing tool using a translator 43 equipped with a capability for converting a naviscript into the information of a point or a route on a map.
  • a naviscript can be semi-automatically generated based on a route on which a user actually walks or drives a car.
  • a script semi-automatically generating unit 19 is intended to semi-automatically generate a naviscript by obtaining times at respective points on a route, position information such as latitude, longitude, etc., and time series data of information for guidance, etc. Since the naviscript is described in a markup language such as the XML, etc. as described above, it can be created by a general-purpose text editor, word processor, etc. By enabling a naviscript to be semi-automatically generated based on a route on which a user actually moves, even a person unfamiliar with a markup language etc. can create the naviscript with ease.
  • FIGS. 14A and 14B explain the process for semi-automatically generating a naviscript.
  • the script semi-automatically generating unit 19 obtains as time series data the time at a specified point and point information, and a varying time and the point information such as latitude, longitude, etc. from the state acquiring unit 16 .
•   the navigation information such as voice data, image data, etc. is inserted at suitable times and/or points according to user instructions.
•   the script semi-automatically generating unit 19 may be arranged not only in the user terminal 10 , but also in the center 40 or a different portable terminal, etc.
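•   The sketch below illustrates, under assumptions, how sampled time/position data and user-inserted guidance could be serialized into naviscript-style markup; apart from the <naviscript>, <title>, <inst> and <time> tags that appear in the examples elsewhere in this description, the tag names and record format are hypothetical.

```python
# Illustrative sketch of semi-automatic naviscript generation from sampled data.
from typing import List, Optional, Tuple

Sample = Tuple[str, float, float, Optional[str]]  # (time, lat, lon, guidance or None)

def to_naviscript(title: str, samples: List[Sample]) -> str:
    lines = ["<naviscript version \"0.3\">", f"  <title> {title} </title>"]
    for time, lat, lon, guidance in samples:
        lines.append("  <inst>")
        lines.append(f"    <time> {time} </time>")
        # <point>/<lat>/<lon>/<info> are hypothetical tag names for illustration
        lines.append(f"    <point> <lat> {lat} </lat> <lon> {lon} </lon> </point>")
        if guidance:                      # navigation info inserted by the user
            lines.append(f"    <info> {guidance} </info>")
        lines.append("  </inst>")
    lines.append("</naviscript>")
    return "\n".join(lines)

print(to_naviscript("walk log", [("10:00", 35.68, 139.70, None),
                                 ("10:05", 35.69, 139.71, "Turn left here")]))
```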
  • FIG. 15 exemplifies the configuration of this system.
  • a personal computer 100 corresponds to the user terminal 10 shown in FIG. 1.
  • a Web center 200 corresponds to the center 40 shown in FIG. 1.
  • An input processing unit 131 corresponds to the operation inputting unit 11 , the network accessing unit 12 , and the medium accessing unit 13 , which are shown in FIG. 1.
  • An output processing unit 132 corresponds to the navigation outputting unit 18 shown in FIG. 1.
  • a script converting unit 134 corresponds to the script converting unit 14 .
  • An instruction processing unit 135 corresponds to the instruction processing unit 15 .
  • a time/point generating unit 136 corresponds to the state generating unit 17 .
  • a map/information managing unit 140 and a voice data managing unit 150 correspond to the database unit 20 .
  • a clock 160 and a GPS (Global Positioning System) 170 or a PHS position detecting unit 180 correspond to the state acquiring unit 16 .
  • a naviscript system 120 for navigation is built, for example, into a Web browser 110 of the personal computer 100 as plug-in software.
  • FIG. 16 shows an example of a home page screen of a naviscript service provided by a Web center. Supposing that the URL of the home page of the naviscript service is specified on the screen of the Web browser 110 , a menu screen like the one shown in FIG. 16 and described in an HTML (HyperText Markup Language) is delivered from the Web center 200 , and displayed on the display screen of the personal computer 100 .
•   the naviscript system 120 may be activated by the Web browser 110 accessing a file based on a particular file extension (such as “.nav”) given to a naviscript after being downloaded.
  • FIG. 18 exemplifies the screen displayed by a naviscript browser 130 . This screen is displayed by being embedded into the screen displayed by the Web browser 110 .
  • the naviscript for the course which starts from “Tokyo ⁇ City” and is received by the input processing unit 131 of the naviscript browser 130 from the Web center 200 via the Web browser 110 , is passed to the script converting unit 134 .
  • the script converting unit 134 converts the naviscript into structured navigation data, and passes the converted data to the instruction processing unit 135 .
  • the instruction processing unit 135 performs the following process as a preparation process of instruction execution.
  • the instruction processing unit 135 determines whether the execution mode set by a user is either navigation mode or simulation mode. If the instruction processing unit 135 determines that the execution mode is the navigation mode, it issues a request to acquire a state to the clock 160 and the GPS 170 or the PHS position detecting unit 180 , obtains an actual current time/point as a state, and adds the instruction for departing from the actual current position to the beginning of the structured navigation data.
  • the instruction processing unit 135 adds a flag indicating “original” to all of the information items such as a time, a point, a route, navigation information, etc. of each instruction within the structured navigation data, and retrieves from the map/information managing unit 140 an attribute or attributes yet to be described among various attributes relating to a place (such as latitude, longitude, altitude, a name, an address, a telephone number, a zip code, etc.) by using described attributes as keys. For example, if only an area such as “Shinjuku Ward” is specified, the attributes of places representative of this area, such as Shinjuku Station, Shinjuku Ward Office etc. are retrieved.
•   If a plurality of retrieval results exist, the instruction processing unit 135 inquires of a user which result to select by using a menu, etc., or selects one of them by using a suitable evaluation index, and describes the retrieved or selected attribute in the corresponding portion of the structured navigation data (if a plurality of retrieval results exist also in subsequent processes, they are processed in a similar manner). Note that such a complementing process is normally performed for a naviscript created by a user himself. If a naviscript is one downloaded from a center, it is recognized to describe the navigation information of the entire course from the beginning. Accordingly, this process is omitted.
  • the instruction processing unit 135 retrieves a route if a route is not specified in the items relating to a route within the structured navigation data or, for example, only a category such as a normal road, a toll road, a highway, time precedence, distance precedence, etc. is specified. If an entire route is specified, the instruction processing unit 135 determines whether the route is either available or unavailable. If the instruction processing unit 135 determines that the route is unavailable, it retrieves a different route. The instruction processing unit 135 additionally describes the retrieved route to the corresponding portion of the instruction, and attaches a flag indicating “addition” to the route.
•   the instruction processing unit 135 converts all of the relatively specified places into absolutely specified places, describes the converted places in corresponding portions of the structured navigation data, estimates expected arrival times at all of the places, and describes the expected times in corresponding portions of the structured navigation data.
  • the instruction processing unit 135 then rearranges all of the instructions in time order, and sets an instruction cursor on the initial instruction of the structured navigation data.
•   If the instruction processing unit 135 determines that the execution mode is the simulation mode, it issues a request to prepare a state to the time/point generating unit 136 , further issues a request to generate a state upon completion of the process for preparing a state, obtains a virtual current time and point, and adds the instruction for departing from the virtual current point to the beginning of the structured navigation data. Thereafter, a preparation process similar to that in the navigation mode is performed.
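•   The following sketch outlines the shared preparation steps (flagging described items as “original”, complementing missing times with expected arrival times, sorting instructions by time, and setting the instruction cursor); the dictionary layout and the estimate_arrival_time helper are assumptions.

```python
# Sketch of the preparation process performed before instruction execution.
def prepare(instructions, departure_time, estimate_arrival_time):
    # Mark every described item as "original" so later modifications
    # (flagged "addition") can be distinguished from the user's description.
    for inst in instructions:
        inst["flag"] = "original"
    # Complement missing times with expected arrival times.
    prev_time = departure_time
    for inst in instructions:
        if inst.get("time") is None:
            inst["time"] = estimate_arrival_time(prev_time, inst)
        prev_time = inst["time"]
    # Rearrange all instructions in time order and set the instruction cursor.
    instructions.sort(key=lambda inst: inst["time"])
    cursor = 0
    return instructions, cursor
```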
  • the instruction processing unit 135 performs the following process as an execution process. Provided first is the explanation about the case where the execution mode is the navigation mode. If the instruction processing unit 135 determines that the execution mode is the navigation mode, it issues a request to acquire a state to the GPS 170 and the clock, and obtains the information about an actual current time and point.
  • the instruction processing unit 135 determines whether or not the current position is on a specified route. If the instruction processing unit 135 determines that the current position is not on the route specified by the instruction indicated by the instruction cursor, it retrieves a route from the current point to a near point on the route, additionally describes the retrieved route to the corresponding portion of the instruction, and attaches a flag indicating “addition” to the route. If a plurality of retrieval results exist, one of them is selected with user specification or by using a suitable evaluation index.
  • traffic information (such as a traffic jam, regulation, accident, etc.) is obtained from VICS (Vehicle Information Communication System).
•   If an automatic route change setting that the user separately makes is ON, if the flag of the route specified by the instruction indicated by the instruction cursor is “addition”, and if a traffic jam, regulation, accident, etc. occurs up to the nearest indispensable point or the nearest indispensable route, to which the flag indicating “original” is attached and which is included in the instructions subsequent to that indicated by the instruction cursor, the instruction processing unit 135 retrieves a route from the current point to the indispensable point or route.
  • the instruction processing unit 135 modifies the route by describing the retrieved or selected route in the corresponding portion of the instruction, and attaches a flag indicating “addition” to the modified route. Since the traffic information use setting or the automatic route change setting is normally used when the present invention is applied to a car navigation system, etc., a user turns off this setting when the user moves on foot as assumed in this example.
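•   A hedged sketch of the off-route check and the automatic route change follows; distance_to_route and retrieve_route stand in for map-database operations and are assumptions, as is the threshold parameter.

```python
# Sketch of off-route detection and automatic route change with "addition" flags.
def check_and_reroute(inst, current_point, off_route_threshold,
                      distance_to_route, retrieve_route,
                      traffic_problem_ahead=False, auto_change_on=False):
    # If the current position is not on the specified route, retrieve a route
    # from the current point back to a near point on the route.
    if distance_to_route(current_point, inst["route"]) > off_route_threshold:
        inst["route"] = retrieve_route(current_point, inst["route"])
        inst["route_flag"] = "addition"
    # If traffic information reports a jam/regulation/accident ahead and the
    # automatic route change setting is ON, retrieve an alternative route to
    # the nearest indispensable ("original") point or route.
    if traffic_problem_ahead and auto_change_on:
        inst["route"] = retrieve_route(current_point, inst["indispensable"])
        inst["route_flag"] = "addition"
    return inst
```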
  • the instruction processing unit 135 issues a request to output navigation to the output processing unit 132 , passes navigation information thereto, and updates the point of the instruction cursor to that of the next instruction.
  • the instruction processing unit 135 repeats the above described process until there is no instruction left.
•   In the simulation mode, the instruction processing unit 135 issues a request to generate a state to the time/point generating unit 136 , obtains a virtual current time/point, and determines whether or not the virtual current point is on a specified route. If the instruction processing unit 135 determines that the virtual current point is not on the route specified by the instruction indicated by the instruction cursor, it retrieves a route from the virtual current point to a near point on the route, additionally describes the retrieved route in a corresponding portion of the instruction, and attaches a flag indicating “addition” to the route.
  • the instruction processing unit 135 issues a request to output navigation to the output processing unit 132 , passes navigation information thereto, and updates the point of the instruction cursor to that of the next instruction.
  • the instruction processing unit 135 repeats the above described process until there is no instruction left.
  • the time/point generating unit 136 sets as a virtual departure time either of an actual current time and a time that the user separately sets, which is selected by the user or the system, sets as a virtual departure point either of an actual current point or a point that the user separately sets, which is selected by the user or the system, and sets as a virtual time elapse speed either of a default virtual time elapse speed that the system sets and a virtual time elapse speed that the user separately sets, which is selected by the user or the system.
  • the time/point generating unit 136 further sets as a virtual place move speed either of a default virtual place move speed that the system sets and a virtual place move speed that the user separately sets, and sets as a simulation sampling time period either of a default simulation sampling time period that the system sets and a simulation sampling time period that the user separately sets, which is selected by the user or the system. Additionally, the time/point generating unit 136 respectively sets the virtual departure time and the virtual departure point to the virtual current time and the virtual current point.
  • the time/point generating unit 136 performs the following process as the state generation process. Upon receipt of the request to generate a state from the instruction processing unit 135 , the time/point generating unit 136 passes the virtual current time/point to the instruction processing unit 135 , and updates the virtual current time by adding the simulation sampling time period to the virtual current time. Still further, the time/point generating unit 136 updates the virtual current point to a point proceeded by a distance calculated by multiplying the virtual place move speed of the place moving means specified in the instruction currently being executed by the simulation sampling time period. However, if the calculated point is beyond the range of the route, the time/point generating unit 136 updates the virtual current point to an end point of the route.
•   Upon receipt of the request to output navigation from the instruction processing unit 135 , the output processing unit 132 outputs navigation information according to the instruction indicated by the instruction cursor. As a result, the course navigation to the National Noh Theater is made on the screen shown in FIG. 18.
  • the output processing unit 132 displays the outline of a route and a distance or time required to reach a destination, or performs navigation vocally and/or by displaying a navigation message or image on the screen shown in FIG. 18 according to the instructions proceeding from the current time or point, while a user is moving on the route that the user himself selects.
  • the output processing unit 132 displays a navigation message or image on the screen shown in FIG. 18 or performs voice navigation according to instructions, based on a set virtual current time/point, and virtual time elapse speed.
  • FIG. 19 exemplifies the system configuration in the case where the present invention is applied to a car navigation system.
  • a center 210 corresponds to the center 40 shown in FIG. 1.
  • a navigation outputting unit 302 corresponds to the navigation outputting unit 18 shown in FIG. 1, and the remaining portion of an input/output processing unit 301 corresponds to the operation inputting unit 11 , the network accessing unit 12 , and the medium accessing unit 13 shown in FIG. 1.
  • a script converting unit 303 corresponds to the script converting unit 14 .
  • An instruction processing unit 304 corresponds to the instruction processing unit 15 .
  • a time/point generating unit 305 corresponds to the state generating unit 17 .
  • a map/information managing unit 310 and a voice data managing unit 320 correspond to the database unit 20 .
  • a clock 160 , a GPS 170 , and a VICS 190 correspond to the state acquiring unit 16 .
  • the input/output processing unit 301 specifies with a menu, etc. the course on which a user desires to drive, and issues a retrieval request to the center 210 .
•   When the input/output processing unit 301 downloads a desired naviscript from the center 210 , it passes the naviscript to the script converting unit 303 .
  • the script converting unit 303 converts the naviscript into structured navigation data, and passes the converted data to the instruction processing unit 304 . Thereafter, once a navigation start instruction is issued, the instruction processing unit 304 prepares for an instruction process based on the structured navigation data, and executes instructions.
•   With a naviscript according to the present invention, navigation in terms of time can also be made according to a current time or an elapsed time. Furthermore, a user can create a naviscript that the user himself or an acquaintance uses, set the created naviscript in a car navigation system, and operate the system.
  • FIG. 20 exemplifies the system configuration in the case where the present invention is applied to a PDC or a PHS.
  • a naviscript 510 is built in a center 500 .
  • a PHS browser 610 of a PHS 600 comprises an input processing unit 611 and an output processing unit 612 .
•   a user issues a request to retrieve a naviscript to a Web server 700 on which naviscripts are recorded, by using the PHS browser 610 of the PHS 600 via a PHS browsing server 520 within a PHS center 500 .
  • An output processing unit 522 within the PHS browsing server 520 downloads a desired naviscript from the Web server 700 , and passes the downloaded naviscript to the input processing unit 521 .
  • the input processing unit 521 passes the naviscript to the script converting unit 523 .
  • the script converting unit 523 parses the naviscript, and converts it into structured navigation data.
•   When the user uses the naviscript in the navigation mode, the instruction processing unit 524 obtains the current state (current time/point) of the user from a clock 620 and a PHS position detecting unit 630 of the PHS 600 , and complements the route information of the structured navigation data. The instruction processing unit 524 then obtains required map/information and voice data from a map/information managing unit 530 and a voice data managing unit 540 based on the structured navigation data according to the state, and passes the obtained information and data to an output processing unit 522 . The output processing unit 522 outputs navigation by making it viewable on a display screen of the PHS browser 610 , etc. via the PHS browsing server 520 .
•   In the simulation mode, the instruction processing unit 524 complements the route information of the structured navigation data by obtaining a virtual current time/point from a time/point generating unit 525 , and outputs navigation by making it viewable on the display screen of the PHS browser 610 in a manner similar to that in the navigation mode.
•   There is a conventional driving managing system which comprises: an inputting unit for inputting data describing an itinerary of a trip desired by a user, a service timetable, or a route; a driving management database describing reservation states of respective highways and facilities, etc., and data such as a traffic jam on a road or in a parking lot, regulations, accidents, weather, etc.; a coordinating unit for making a comparison and coordination between the data of an input desired itinerary/route and that of the driving management database, for modifying the itinerary/route data on demand according to the result of the comparison/coordination, and/or for updating the data of the driving management database; and an outputting unit for outputting the resultant itinerary/route data.
•   There is also a point/route navigating apparatus, which is another implementation of a conventional technique, for performing various types of navigation in a car navigation system, a PC, a PDA, a PDC, a PHS, etc.
  • Such an apparatus comprises: an inputting unit for inputting a point/route (sequence) desired by a user, and an executing unit for performing navigation according to the input point/route (sequence).
•   Because the format of the itinerary/route data for a reservation in the conventional driving managing system is different from that of the point/route data for navigation in the conventional point/route navigating apparatus, these data must be separately created, managed, and operated, which causes an inconvenience to a developer, an operator, and a user. Additionally, because a data format is different depending on each driving managing system, many types of data must be created, managed, and operated, which also causes an inconvenience to a developer, an operator, and a user.
•   By using a naviscript according to the present invention, the formats of data input for making reservations in various driving managing systems can be made common to those of the data input for performing navigation in various point/route navigating apparatuses.
  • FIG. 21 exemplifies the configuration in the case where the present invention is applied to a driving managing system.
  • a driving managing center 1000 and a terminal 1010 used by a user are interconnected by a network, and transmit/receive a naviscript.
•   a driving management database 1004 obtains various information items, such as a road state and a reservation state of each facility, from various information providing sources 1020 , and manages the obtained information.
  • the terminal 1010 itself may be a portable information device, or an information device to be built into a car navigation system, etc.
•   the driving managing center 1000 comprises: a receiving unit 1001 for receiving a naviscript transmitted from the terminal 1010 ; a converting unit 1002 for converting a received naviscript into structured navigation data; a coordinating unit 1003 for making a comparison and coordination between the converted structured navigation data and the data stored in the driving management database 1004 ; an inversely converting unit 1005 for inversely converting the coordinated structured navigation data into a naviscript; and a transmitting unit 1006 for returning the converted naviscript to the terminal 1010 .
•   the terminal 1010 comprises: an input processing unit 1011 for inputting a naviscript desired by a user from a source in a network 1030 , a medium 1031 such as a CD-ROM, a magnetic disk, etc., or a keyboard, and the like; a transmitting unit 1012 for transmitting the input naviscript to the driving managing center 1000 ; a receiving unit 1013 for receiving the naviscript which is modified and transmitted by the driving managing center 1000 ; a converting unit 1014 for converting the received naviscript after being modified into structured navigation data which can be executed by the local terminal itself; an execution processing unit 1015 for generating navigation (information) based on the converted structured navigation data; and an output processing unit 1016 for outputting the generated navigation (information).
  • the input processing unit 1011 within the terminal 1010 obtains from the network 1030 or reads from the medium 1031 such as a CD-ROM, a magnetic disk, etc. a desired naviscript which describes the destination currently being headed and a scheduled itinerary or route of a trip or an operation, or inputs user instruction information with a keyboard.
•   An example of a naviscript that a user desires is provided below.
•   the contents of the naviscript indicate an overnight trip to Lake Yamanaka, and specify a course which is bound from Numazu to Gotenba on Tomei Highway by car, drops in at Fuji ⁇ Land, and reaches Lake Yamanaka ⁇ Lodge for an overnight stay on December 23rd. The required time is scheduled to be 6 hours and 30 minutes.
•   <naviscript version “0.3”> <title> example </title> <copyright> All Rights Reserved, Copyright (C) FujiLabo Ltd. 1998.
  • the transmitting unit 1012 transmits this naviscript to the driving managing center 1000 .
  • the receiving unit 1001 receives the naviscript transmitted from the terminal 1010 , and the converting unit 1002 converts the received naviscript into structured navigation data.
•   Data 1 and data 2 exemplify route data, data 3 exemplifies point data, and data 4 and data 5 exemplify facilities data.
  • a maximum number of cars/people indicates the maximum number of cars or people that can utilize a road or a facility.
  • a reserved number of cars/people indicates the number of cars/people that make reservations.
•   A jam percentage of 100% represents the state where the reserved number of cars/people reaches the maximum number, and no more use is allowed.
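•   The jam percentage can therefore be computed as the reserved number divided by the maximum number, times 100, as in the short sketch below.

```python
# Worked sketch of the jam percentage used by the driving management database.
def jam_percentage(reserved: int, maximum: int) -> float:
    return 100.0 * reserved / maximum

def is_available(reserved: int, maximum: int) -> bool:
    # No more use (or admission) is allowed once the jam percentage reaches 100%.
    return jam_percentage(reserved, maximum) < 100.0

print(jam_percentage(180, 200))   # 90.0 -> still available
print(is_available(200, 200))     # False -> fully booked
```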
  • the coordinating unit 1003 makes a comparison and coordination between the structured navigation data received from the converting unit 1002 and the data such as the states of highways, the states of facilities, a traffic jam on a road or in a parking lot, a regulation, an accident, weather, which are described in the driving management database 1004 , modifies the structured navigation data based on the result of the comparison and coordination, and/or updates the data stored in the driving management database 1004 .
  • the coordinating unit 1003 makes a comparison between the above described naviscript that the user desires and ⁇ data 2 > stored in the driving management database 1004 , determines that the jam percentage of Tomei Highway is already 100% and this highway is unavailable, and also determines that the jam percentage of the reservation state of Fuji ⁇ Land is 100% and admission is not allowed. Therefore, the coordinating unit 1003 modifies the corresponding portion of the converted structured navigation data, and/or changes the data of the driving management database 1004 .
•   the contents are modified so that a national road is used instead of Tomei Highway, and the stop at Fuji ⁇ Land is omitted.
•   the following portion (indicated by the corresponding description of the naviscript) of the structured navigation data
      <route> <means> car </means> <name> Tomei Highway </name> <category> highway </category> </route>
    is modified as follows:
      <route> <means> car </means> <name> Route 246 </name> <category> national road </category> </route>
    Additionally, the following portion is deleted.
•   <inst id “inst-object-Fuji ⁇ Land”>
      <time> 11:00 </time>
      <object> <name> Fuji ⁇ Land </name> <category> amusement park </category> </object>
      <route> thesame </route>
    </inst>
  • the inversely converting unit 1005 converts the structured navigation data modified by the coordinating unit 1003 into a naviscript.
  • An example of the naviscript which is modified by the coordinating unit 1003 and inversely converted by the inversely converting unit 1005 is provided below.
  • the following naviscript indicates the course from Numazu to the Lake Yamanaka for an overnight stay via Gotenba ⁇ Intersection by using Routes 246 and 136 .
•   <naviscript version “0.3”> <title> example </title> <copyright> All Rights Reserved, Copyright (C) FujiLabo Ltd.
  • the transmitting unit 1006 returns the converted naviscript after being modified to the terminal 1010 .
•   The receiving unit 1013 receives the naviscript transmitted from the driving managing center 1000 , the converting unit 1014 converts the received naviscript into structured navigation data, the execution processing unit 1015 generates navigation (information) based on the structured navigation data, and the output processing unit 1016 outputs the generated navigation information with a display, a printer, a speaker, etc.
  • FIG. 22 exemplifies a naviscript GUI editor screen displayed on the terminal 1010 .
  • An editor screen 1100 is used to create a naviscript that a user desires, and comprises an operation menu 1101 for selecting and instructing various types of editing operations, and a map operation icon 1102 for moving a display area on a map displayed on a map display area 1103 .
  • the user creates his or her desired naviscript by using the GUI which links with the map information displayed in the map display area 1103 .
  • the naviscript is represented as a tree structure composed of a course summary portion 1104 , a course details portion (navi-tag portion) 1105 , (each) instruction portion (inst-tag portion) 1106 , etc., and is displayed on the left-hand side of the map display area 1103 .
  • FIG. 23 An example of a naviscript browser screen displayed on the terminal 1010 is illustrated in FIG. 23.
  • the browser screen 1110 is displayed when the naviscript after being modified, which is transmitted from the driving managing center 1000 , is verified in the simulation mode or executed in the navigation mode.
•   The browser screen 1110 includes areas such as an information display area 1112 for displaying position information at each point, a map display area 1113 for displaying a current position and route of a user on a map, a latitude/longitude display area 1114 , a text display area 1115 for displaying text data information included in navigation information, and an image display area 1116 for displaying image data information included in navigation information, as well as a map moving button 1121 for moving a display area on a map, a reduction scale changing button 1122 for changing a reduction scale of a map display, a various-types setting button 1123 for setting the simulation mode, the navigation mode, etc., a simulation start button 1124 , a fast-forward button 1125 , an end button 1126 , a reset button 1127 , etc.
  • FIG. 24 shows the flow of the process performed by a terminal when the present invention is applied to a driving managing system.
  • the transmitting unit 1012 transmits the naviscript to the driving managing center 1000 (step S 102 ).
•   the terminal 1010 receives a modified naviscript from the driving managing center 1000 with the receiving unit 1013 (step S 103 ), and the converting unit 1014 converts the received naviscript into structured navigation data to be used in the local terminal 1010 (step S 104 ).
  • the execution processing unit 1015 then creates navigation information based on the structured navigation data (step S 105 ), and the output processing unit 1016 outputs the generated navigation information (step S 106 ).
  • FIG. 25 shows the flow of the process performed by a driving managing center when the present invention is applied to the driving managing system.
  • the driving managing center 1000 receives a naviscript that a user desires from the terminal 1010 with the receiving unit 1001 (step S 111 ), and the converting unit 1002 converts the received naviscript into structured navigation data (step S 112 ).
  • the coordinating unit 1003 makes a comparison/coordination between the structured navigation data and the data stored in the driving management database 1004 .
  • the coordinating unit 1003 modifies the structured navigation data according to the result of the comparison/coordination, and updates the data stored in the driving management database depending on need (step S 113 ).
  • the inversely converting unit 1005 inversely converts the coordinated structured navigation data into a naviscript (step S 114 ).
  • the transmitting unit 1006 then returns the naviscript to the terminal 1010 (step S 115 ).
•   FIG. 26 shows the flow of the comparison/coordination process (step S 113 of FIG. 25) performed within the driving managing center 1000 .
  • the coordinating unit 1003 sets the instruction cursor to the leading instruction of structured navigation data (step S 121 ), and extracts the contents relating to time, a place, and a route from the instruction indicated by the instruction cursor (step S 122 ).
  • the coordinating unit 1003 sets a data index in the leading data within the driving management database 1004 (step S 123 ), and retrieves the data matching the contents extracted from the instruction while incrementing the data index (step S 124 ).
•   the coordinating unit 1003 determines whether or not matching data exists (step S 125 ). If the coordinating unit 1003 determines that matching data exists, it further determines whether or not there is room in the reservation items of the data (step S 126 ). If the coordinating unit 1003 determines that there is no room in the reservation items, it modifies the extracted contents to replaceable ones (step S 127 ). The process then goes back to step S 123 .
•   If the coordinating unit 1003 determines that there is room in the reservation items, it adds one more reservation item to the data (step S 128 ), and copies the instruction indicated by the instruction cursor to the end of the modified structured navigation data (step S 129 ). If no matching data exists in step S 125 , the process goes directly to step S 129 .
  • the coordinating unit 1003 increments the instruction cursor by 1 (step S 130 ), and determines whether or not the instruction cursor exceeds the last instruction (step S 131 ). If the instruction cursor does not exceed the last instruction, the process goes back to step S 122 . If the instruction cursor exceeds the last instruction, the process is terminated.
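•   The loop of steps S121 to S131 can be sketched as follows; the database record layout and the find_replacement helper (step S127) are simplifying assumptions.

```python
# Sketch of the comparison/coordination loop (steps S121-S131).
def coordinate(instructions, database, find_replacement):
    coordinated = []
    for inst in instructions:                          # steps S121, S130, S131
        contents = dict(inst)                          # time/place/route (S122)
        while True:
            match = next((d for d in database
                          if d["name"] == contents.get("name")), None)   # S124
            if match is None:                          # S125: no matching data
                break
            if match["reserved"] < match["maximum"]:   # S126: room left
                match["reserved"] += 1                 # S128: add a reservation
                break
            contents = find_replacement(contents)      # S127: replace and retry
        coordinated.append(contents)                   # S129: copy instruction
    return coordinated
```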
  • the present invention is applied to a driving managing system, so that naviscripts are transmitted/received between the driving managing center 1000 and the terminal 1010 .
  • itinerary/route data for reservations in various driving managing systems and point/route data for navigation in various point/route navigating devices can be made common.
  • the driving managing center 1000 makes coordination between a naviscript, which is desired by a user and transmitted from the terminal 1010 , and the driving data managed by the driving management database 1004 , and returns the coordinated naviscript. Consequently, the user can utilize the navigation information in which various items of driving management information are reflected. Namely, the following effects can be promised by applying the present invention to the driving managing system.
•   Since a naviscript is text data that can be described by using a combination of a name which can identify the type of each information item and the contents thereof, it can be easily read, written, retrieved, and processed.
  • Anybody can provide and utilize a naviscript by using a network or an electronic medium anywhere at any time.
  • a conventional navigation system or scheduler, etc. never automatically proposes a method for coordinating time.
•   If a user cannot move on schedule, he or she must manually retrieve a state or information to reset the schedule.
  • FIG. 27 exemplifies the configuration of the time coordinating system during a move, to which the present invention is applied.
  • This system comprises a scheduler 1200 for managing an action timetable based on a schedule described by a naviscript; an action rule base 1220 ; and a monitor (monitoring/executing device) 1210 for monitoring a current position/time, and for presenting/executing an action of a rule if the rule matching the action rule base 1220 exists.
  • An action to be executed is described according to whether or not there is enough time before an arrival time in the action rule base 1220 .
  • the scheduler 1200 calculates expected arrival times at respective points from a current place to a destination based on an input schedule.
•   the monitor 1210 comprises a current position measuring unit 1211 for measuring the current position of a user; a current time measuring unit 1212 for measuring a current time; a next point expected arrival time calculating unit 1213 for calculating an expected arrival time at each point from the current point; a rule base matching unit 1214 for making a matching between the expected arrival time at the next point and rules within the action rule base 1220 ; and an action executing unit 1215 for executing an action to be executed by a user according to a corresponding rule (depending on whether or not the user can arrive by an expected arrival time).
•   Assume the following notation: the information at the “i”th point is point_{i}; the move method from the “i”th point to the “i+1”th point is means_{i,i+1}; the expected arrival time at the “i+1”th point is time_{i+1}; the information at the current point is point_{now}, which, without losing generality, exists between point_{i} and point_{i+1}; the move method from the current point to the “i+1”th point is means_{now,i+1}; and the expected arrival time at the “i+1”th point is time_{i+1,speed} based on the assumption that the speed of the move method is “speed”.
  • the scheduler 1200 inputs as an initial schedule the information at respective points, the move method to the respective points, the current time (initially, the time at the first point (departure place)), and the speed of the move method (normal/maximum). For example, if a move method is walking, the normal speed of the move method is a speed at which a user walks at a normal pace, while the maximum speed is a speed at which the user walks at a quick pace. If the move method is a train or a bus, there is no need to make a distinction between the normal and the maximum speeds.
•   time_{1} indicates that a departure time is specified, while time_{j} and time_{k} respectively indicate that the arrival times at the “j”th and “k”th points are specified.
  • the scheduler 1200 calculates the expected arrival times at unspecified points among those at the respective points from a departure place to a destination, and transmits the calculated times to the monitor 1210 . Namely, the following times are transmitted in this case.
  • the initial schedule may sometimes be modified by the monitor 1210 . If an input schedule is that modified by the monitor 1210 , the input of the scheduler 1200 includes the information about the respective points from a current point to a destination, a move method to the respective points, the current time, the speed of the move method.
  • the output of the scheduler 1200 includes expected arrival times.
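•   The expected arrival times can be computed by accumulating distance divided by speed over the legs between points, as in the sketch below; the leg distances and speeds are assumed inputs.

```python
# Sketch of the scheduler's expected-arrival-time calculation.
def expected_arrival_times(departure_time, legs):
    """legs: list of (distance_m, speed_m_per_s) between consecutive points.
    Returns time_{2}, ..., time_{n}, measured in seconds from midnight."""
    times = []
    t = departure_time
    for distance, speed in legs:
        t += distance / speed     # time_{i+1} = time_{i} + distance / speed
        times.append(t)
    return times

# e.g. depart at 9:00 (32400 s), walk 600 m at 1.2 m/s, then train 5 km at 12 m/s
print(expected_arrival_times(32400, [(600, 1.2), (5000, 12.0)]))
```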
•   the monitor 1210 obtains the current time and point (here, the current point is assumed to be point_{now} existing between point_{i} and point_{i+1}) at predetermined time intervals, at predetermined distance intervals and/or for predetermined places with the use of the current time measuring unit 1212 and the current position measuring unit 1211 .
•   the next point expected arrival time calculating unit 1213 calculates an expected arrival time time_{i+1,normal-speed} when the speed of a move method is normal, and an expected arrival time time_{i+1,max-speed} when the speed of the move method is maximum, from the current point to the next point based on the information of the current time and point.
  • the rule base matching unit 1214 makes a matching between the expected arrival time at the calculated normal/maximum speed and the rules within the action rule base 1220 , and extracts a matching rule.
  • the action executing unit 1215 executes the action described in the extracted rule in accordance with the rule.
  • the contents of this rule indicate as follows. If a user moves from the current point to the next point at a normal speed and if the user reaches the next point well ahead of schedule, the action executing unit 1215 displays the message “Well ahead of schedule” on a display screen, etc. based on the action specified by the corresponding rule, and/or vocally outputs the message.
•   If the user can arrive at the next point on time only by moving at the maximum speed, the action executing unit 1215 displays and/or vocally outputs the message “Behind time unless going faster”.
•   If the user cannot arrive at the next point on time even by moving at the maximum speed, the action executing unit 1215 displays the message “Behind schedule”, and further displays an inquiry message for prompting the user to select a coordination method after that.
  • the inquiry messages used in this example are menus making the following inquiries.
  • the action executing unit 1215 modifies the subsequent schedule, and transmits the modified schedule to the scheduler 1200 .
  • the scheduler 1200 calculates the expected arrival times at the respective points from the current point to the destination at the normal speed based on the modified schedule, and returns the calculated times to the monitor 1210 . In this case, the expected arrival time at the destination is behind the initial schedule.
  • the action executing unit 1215 modifies the subsequent schedule to that at the maximum speed, and transmits the modified schedule to the scheduler 1200 .
  • the scheduler 1200 calculates the expected arrival times at the respective points from the current point to the destination at the maximum speed, and returns the calculated times to the monitor 1210 .
  • FIG. 28 exemplifies the display screen of the monitor 1210 .
  • This display screen displays scheduled en-route spots and their expected arrival times, a move method, a map, etc. according to the schedule, and also indicates the point at which a user currently stays.
  • the user is currently moving from Nakameguro Station of the subway to Nakameguro Station of the Tokyu Toyoko Line according to the move schedule from Makuhari Building to the Kawasaki Plant, which is shown in FIG. 28.
  • the current time in this situation is displayed.
•   Since the user is scheduled to take the train which starts from Nakameguro Station of the Tokyu Toyoko Line at 9:34 in this case, it is evident at the current time point (9:40) that the user will miss the train. Therefore, a message indicating that the user is not in time for the next point (the train starting from Nakameguro Station at 9:34) is displayed on the screen according to the corresponding rule within the action rule base 1220 .
  • FIG. 29 shows the flow of the process performed by the scheduler 1200 .
  • the scheduler 1200 inputs a sequence of positions of the respective points from the current point to the destination of the schedule, a sequence of move methods between the respective points, the current time, and the speed types (normal/maximum) of the move methods (step S 201 ).
  • the scheduler 1200 calculates expected arrival times at the respective points from the current point to the destination (step S 202 ), and outputs the expected arrival times at the respective points (step S 203 ).
  • FIG. 30 shows the flow of the process performed by the monitor 1210 .
  • the monitor 1210 repeats the operations performed in steps S 212 to S 214 at predetermined time intervals, at predetermined distance intervals and/or for predetermined places (step S 211 ).
  • the monitor 1210 calculates the expected arrival time at the next point in the case that a user moves at a normal and a maximum speed (step S 212 ).
  • the monitor 1210 then makes a matching between the expected arrival time at the next point and the rules within the action rule base 1220 (step S 213 ), and executes the action of the corresponding rule (step S 214 ).
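•   A sketch of one monitor step follows, combining steps S212 to S214; the rule texts follow the examples above, while the margin threshold and the numeric inputs are assumptions.

```python
# Sketch of one monitor step: compute expected arrival times at the normal and
# maximum speeds, then match simple rules from the action rule base.
def monitor_step(now, remaining_distance, scheduled_arrival,
                 normal_speed, max_speed, margin=60.0):
    t_normal = now + remaining_distance / normal_speed   # time_{i+1, normal-speed}
    t_max = now + remaining_distance / max_speed         # time_{i+1, max-speed}
    if t_normal <= scheduled_arrival - margin:
        return "Well ahead of schedule"
    if t_normal > scheduled_arrival >= t_max:
        return "Behind time unless going faster"
    if t_max > scheduled_arrival:
        return "Behind schedule"       # then ask which coordination method to use
    return "On schedule"

# e.g. 9:40 now, 900 m left, scheduled arrival 9:44, walking 1.2 or 1.8 m/s
print(monitor_step(now=34800, remaining_distance=900,
                   scheduled_arrival=35040, normal_speed=1.2, max_speed=1.8))
```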
  • a user sets the route of a sightseeing tour by himself, and obtains the information for guidance corresponding to the attribute of the user at the respective points on the arbitrarily set route, from the navigation information database managing the information for guidance like an explanation of a tour conductor, so that the user can create a navigation plan. Additionally, the created navigation plan is executed by the user portable terminal while the user actually moves on the route, so that the user can enjoy the sightseeing of the individual tour plan even if a tour conductor does not attend.
•   Not map information such as an intersection name, etc. but navigation information using a landmark such as a building, a signboard, etc., for example, “Turn to the left after passing through a big ⁇ signboard”, is adopted, so that route navigation similar to that made by a person, which is natural and understandable, can be easily created.
  • FIG. 31 exemplifies the configuration of the navigation plan creating and information for guidance managing system.
  • This system comprises a navigation plan creating device 1300 and an information for guidance database managing device 1310 .
  • the information for guidance database managing device 1310 manages an information for guidance database 1311 storing information for guidance attached to a point or time in map data, a schedule, a timetable, a calendar, etc. and a navigation plan.
  • the navigation plan creating device 1300 comprises: an information for guidance attaching unit 1301 for attaching information for guidance to map/schedule data stored in the information for guidance database 1311 and for making an association between them; a condition setting unit 1302 for setting a condition such as a time period during which information for guidance is valid, a valid attribute, etc; a route setting unit 1303 for setting a route of a navigation plan; an information for guidance extracting unit 1304 for extracting information for guidance required for a navigation plan from the information for guidance database 1311 ; and a navigation plan creating unit 1305 for creating a navigation plan based on the extracted information for guidance.
•   a terminal 1320 comprises: a navigation plan executing unit 1321 for executing a received navigation plan according to a user point/time; a presenting unit 1322 for presenting navigation information to the user; a current point obtaining unit 1323 for obtaining the current point of the user; and a time measuring unit 1324 for obtaining the current time.
  • the terminal 1320 is, for example, a car navigation device, a PC, a PDA, a PHS, a PDC, etc.
  • FIG. 32 summarizes the flow of the processing performed by this system.
  • the information for guidance attaching unit 1301 attaches information for guidance to a point/time of map/schedule data.
  • “attach” means that information for guidance is associated with a particular point on a screen such as a map, a schedule, a calendar, etc.
  • the condition setting unit 1302 sets a time condition such as a time period during which attached information for guidance is valid or an attribute condition such as a user type, etc., and stores a set condition in the information for guidance database 1311 (step S 301 ).
  • the route setting unit 1303 sets a route of a navigation plan. To set a route, points/areas to be included in the route are selected on the map data displayed on a display device, etc., and time/attribute conditions of a navigation plan are further set (step S 302 ).
  • the information for guidance extracting unit 1304 extracts from the information for guidance database 1311 the information for guidance corresponding to the points/areas and the time/attribute conditions, which are set for the route of the navigation plan (step S 303 ), and the navigation plan creating unit 1305 creates a navigation plan by using the extracted information for guidance based on the set route (step S 304 ).
  • the terminal 1320 executes the created navigation plan and presents the navigation information according to the user point or time (step S 305 ).
  • FIGS. 33A and 33B exemplify the process for attaching navigation information.
  • Graphic areas such as rectangles, ellipses, etc. indicate ranges where information for guidance are valid, and can be arbitrarily set by a user.
  • the point to which information for guidance is attached is specified by designating the point and area represented as a graphic such as a rectangle, an ellipse, etc. on the screen displaying the map data 1330 as shown in FIG. 33A, by designating a facility object such as a building, a road, etc. on a map, or by directly describing the place in a naviscript, etc. For example, a range where the Rainbow Bridge is seen is set as a rectangle on the map data 1330 , and the information for guidance 1331 of the Rainbow Bridge is attached thereto.
  • a time slot of the schedule data 1341 in the form of one day is specified, and the information for guidance 1342 and 1343 are attached thereto.
  • a date (one or a couple of dates) of the schedule data 1344 in the form of one month is specified, and the navigation information for guidance 1345 is attached thereto.
  • the information for guidance setting screen 1350 includes an information for guidance input field 1351 for directly inputting information for guidance, an image file name input field 1352 for inputting the name of a file to be used if information for guidance is image data, a voice file name input field 1353 for inputting the name of a voice data file, a reference button 1354 for referencing a specified file, a time condition setting button 1355 for setting a time condition, etc., an OK button 1356 , and a cancel button 1357 .
  • the contents of attached information for guidance are sightseeing navigation information, for example, “This was built in the year ⁇ , and is famous for XX . . .”, etc.
  • the information for guidance may be directly input from the information for guidance input field 1351 , the information created by a travel agent may be used, or the information for guidance may be input by specifying its file name.
  • voice or image information may be attached in addition to text data.
  • a required file name (such as “bbb.jpg”, “aaa.wav”, etc.) is specified in the image file name input field 1352 or the voice file name input field 1353 .
  • Such information for guidance is created also for other specified points or areas in a similar manner.
•   the time condition setting button 1355 is clicked to start up the condition setting unit 1302 . Then, a time condition such as a time period during which information for guidance is actually presented, a date, a time period during which information for guidance is valid, etc. is set on another setting screen (not shown). Furthermore, a direction of access to the area to which information for guidance is to be attached may be specified, and a condition of presenting the information for guidance only when the area is accessed from a particular direction may also be set.
  • FIG. 35 shows an example of the information for guidance attachment process in the case where the information for guidance 1361 as the sightseeing navigation contents “This was built in the year ⁇ , and is famous for XX . . .” are attached to the point A on the map data 1360 .
•   An example where the result of the information for guidance attachment process is represented by a naviscript is provided below.
  • the contents of the naviscript mean that the information for guidance is presented as text data “This was built in the year ⁇ , and is famous for XX . . .” within a radius of 1 km from a point A (input and named by a user) at the latitude N35.11.11.111, the longitude E135.22.22.222, and at an address 1-1, ⁇ , ⁇ City, and the voice and the image data stored in the files “aaa.wav” and “bbb.jpg” are output.
  • the information of the latitude, the longitude and the address of the point A may be described by obtaining the data that the map data originally hold.
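•   Presenting the information for guidance “within a radius of 1 km from a point” amounts to a distance check against the attached point, as in the sketch below; the data layout and the present callback are assumptions.

```python
# Sketch of presenting information for guidance within a radius of an attached point.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two lat/lon pairs, in km."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def maybe_present(user_lat, user_lon, guidance, present):
    # guidance: {"lat": ..., "lon": ..., "radius_km": 1.0, "text": ..., ...}
    if distance_km(user_lat, user_lon,
                   guidance["lat"], guidance["lon"]) <= guidance["radius_km"]:
        present(guidance["text"])   # and play the voice file, show the image, etc.
```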
  • the information for guidance attached to map or schedule data are stored in the information for guidance database 1311 , and managed by the information for guidance managing device 1310 .
  • the information for guidance attached to the map or the schedule data can be displayed as a guidance sheet. By representing information for guidance as a guidance sheet, it becomes easier to verify the correspondence between the contents of attached information for guidance and map/schedule data.
  • FIGS. 36A and 36B exemplify guidance sheets.
  • Guidance sheets 1371 can be displayed for the respective seasons as shown in FIG. 36A, or a guidance sheet 1372 can be displayed in a way such that the sheet varies as time elapses as shown in FIG. 36B. Additionally, a guidance sheet can be displayed for each user attribute such as age, sex, objectives, etc. Also the correspondence between the time-conditional information for guidance and a point to be attached can be displayed as a guidance sheet which represents the correspondence in a three-dimensional space including the time axis shown in FIG. 37.
•   the route setting unit 1303 first specifies a route that a user desires.
•   the route is specified, for example, by using a method with which the system automatically retrieves and sets a route if a departure point and a destination are specified, a method with which a user selects a point/road on a map screen with the use of a pointing device to set a route, a method for setting a route by correcting an optimum route along a line that a user draws on a map screen, etc.
  • the set route passes through some of the areas on the map data, to which the information for guidance are attached beforehand. Assume that there is map data 1380 on which information for guidance are attached to the areas including the points A to J as shown in FIG. 38A. Also assume that a user specifies the route from a start point “s” to a goal “g”, which is shown in FIG. 38A in order to create a navigation plan on the map data 1380 . At this time, the areas that the route passes through are A, F and J.
  • the information for guidance extracting unit 1304 extracts the information for guidance about the areas A, F and J from the information for guidance database 1311 .
  • the navigation plan creating unit 1305 creates a navigation plan (from the start point “s” to the goal point “g” via the points A, F and J) by using the extracted information for guidance based on the specified route. Then, a naviscript is created, for example, by giving a name “AFJ Tour” to the plan.
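•   The extraction and plan creation can be sketched as follows; representing each attached area as a rectangle and the plan as a simple dictionary are assumptions for illustration.

```python
# Sketch of creating a navigation plan from the areas a set route passes through.
def area_contains(area, point):
    (x1, y1, x2, y2) = area["rect"]      # axis-aligned rectangle on the map
    x, y = point
    return x1 <= x <= x2 and y1 <= y <= y2

def create_navigation_plan(name, route_points, areas):
    """route_points: list of (x, y) along the set route;
    areas: [{"name": "A", "rect": (...), "guidance": "..."}, ...]."""
    plan = {"title": name, "instructions": []}
    for area in areas:                   # e.g. the areas A, F and J on the route
        if any(area_contains(area, p) for p in route_points):
            plan["instructions"].append(
                {"point": area["name"], "info": area["guidance"]})
    return plan

plan = create_navigation_plan("AFJ Tour",
                              [(0, 0), (2, 1), (5, 3)],
                              [{"name": "A", "rect": (1, 0, 3, 2), "guidance": "..."}])
```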
  • a navigation plan is represented by a naviscript, so that the navigation plan can be delivered via a network or an electronic medium. As a result, a navigation service that anybody can utilize anywhere can be realized.
•   the initial portion from <title> to </cost> indicates the summary of the entire navigation plan, and the portion from <par> indicates individual navigation information.
•   the summary is added to endow the sequence of navigation information from <par> with some meaning, and to facilitate the understanding of the sequence. By way of example, for a sightseeing tour, the contents of navigation can be known at a glance.
  • the navigation plan executing unit 1321 executes the navigation plan according to the current point from the current point information obtaining unit 1323 and the current time from the time measuring unit 1324 .
  • the presenting unit 1322 presents the navigation information in real time.
  • the navigation plan executing unit 1321 executes the portion of the point F of the navigation plan, and presents the information for guidance about the point F to the user. Note that the information for guidance with a condition restricting execution time is not presented when the navigation plan is actually executed if the condition is not satisfied.
  • the navigation plan executing unit 1321 can also simulate a created navigation plan by executing the plan independently of the information from the current point obtaining unit 1323 or the time measuring unit 1324 .
  • the navigation plan creating device 1300 sets an area of a predetermined size in the vicinity of a specified route, and extracts the information for guidance about points within the area so as to present the information for guidance attached to the points close to the specified route.
  • the route setting unit 1303 expands the route in its periphery as shown in FIGS. 38C and 38D.
  • the information for guidance database 1311 includes a point database 1381 and a schedule database 1382 , which are shown in FIG. 38E.
  • the point database 1381 stores information for guidance such as shops 1391 , restaurants 1392 , parks 1393 , gas stations 1394 , fire hydrants 1395 , etc.
  • the schedule database 1382 stores the information for guidance attached to schedule data.
  • the information for guidance extracting unit 1304 extracts from the point database 1381 the information for guidance about the points included in the area on the expanded route, and the navigation plan creating unit 1305 creates a navigation plan by using the extracted navigation information. In this way, even if information for guidance is attached to a point on a map, a suitable navigation plan can be created.
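A minimal sketch of this extraction, assuming the route is approximated by straight segments and that a guidance point qualifies when its distance to any segment is within a fixed margin; the function names, coordinates, and margin value are assumptions made for illustration.

```python
import math

def dist_point_to_segment(p, a, b):
    """Distance from point p to segment ab (planar coordinates)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def points_near_route(route, candidate_points, margin):
    """Return names of guidance points lying within 'margin' of the expanded route."""
    selected = []
    for name, p in candidate_points.items():
        if any(dist_point_to_segment(p, route[i], route[i + 1]) <= margin
               for i in range(len(route) - 1)):
            selected.append(name)
    return selected

route = [(0, 0), (4, 0), (4, 3)]
points = {"shop 1391": (1, 0.4), "park 1393": (6, 6), "gas station 1394": (4, 2)}
print(points_near_route(route, points, margin=0.5))  # -> ['shop 1391', 'gas station 1394']
```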
  • as another use, a navigation plan can be used for route navigation.
  • navigation data represented with landmarks, such as "a big signboard of ◯◯" or "a big triangular building", are attached to the points on the map data 1390 and stored in the information for guidance database 1311 in order to provide route navigation at an intersection or a branch road.
  • the route setting unit 1303 creates a route
  • the information for guidance extracting unit 1304 extracts information for guidance for the respective points on the route.
  • the navigation plan creating unit 1305 creates a navigation plan by combining the information for guidance and turn directions, etc.
  • the user carries the terminal 1320 storing the created navigation plan, and starts to walk from the ◯◯ Station, which is the departure point.
  • the navigation plan executing unit 1321 presents the information for guidance attached to the point at the corresponding point on the route.
  • route navigation, for example, "Go straight from the Station, pass through the intersection A, proceed obliquely right at the big signboard of ◯◯, and pass the intersection at which the big triangular building exists, so that the destination can be found", is made by presenting the route navigation information at the respective points.
  • in this way, route navigation is made not only with point and intersection names as in conventional systems, but route navigation that a user can easily understand can also be created and delivered.
  • the navigation information managing device 1310 manages navigation information or navigation plans as a database, so that information for guidance or a navigation plan can be efficiently created and delivered. Furthermore, information for guidance and navigation plans are altogether managed, so that it becomes easy to retrieve, evaluate, etc. information for guidance or a navigation plan.
  • the information for guidance database 1311 stores together navigation data such as data requiring almost the same time or cost, routes passing through the same point, data classified according to a user attribute or a season, etc., which leads to an increase in retrieval efficiency.
  • a navigation plan satisfying the request is created by combining stored navigation plans. Supposing that there are navigation plans for the routes “A, B, C, D, E” and “K, L, C, X, Y, Z”, a navigation plan for a route “A, B, C, X, Y, Z” is created based on the common point C. Still further, a particular navigation plan can be called by attaching a code to information for guidance or a navigation plan, and by inputting the code.
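A minimal sketch of combining two stored plans at a common point, under the assumption that each plan is simply an ordered list of point names; the helper name and this very simple representation are assumptions, not part of the described system.

```python
def splice_plans(plan1, plan2, common):
    """Combine the head of plan1 up to 'common' with the tail of plan2 from 'common'."""
    if common not in plan1 or common not in plan2:
        raise ValueError("no common point")
    return plan1[:plan1.index(common)] + plan2[plan2.index(common):]

# "A, B, C, D, E" and "K, L, C, X, Y, Z" share the point C:
print(splice_plans(list("ABCDE"), list("KLCXYZ"), "C"))  # -> ['A', 'B', 'C', 'X', 'Y', 'Z']
```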
  • this system can easily create a navigation plan, and can easily access and retrieve information for guidance having contents like sightseeing navigation.
  • there is a known technique relating to a portable information system for determining the point data of a user with a GPS receiver, for retrieving from a database, with a retrieval key, information about the data obtained by arbitrarily combining a point, orientation, proceeding direction, eye direction, speed, altitude, date, time, etc., which are obtained from the user point, and for outputting the retrieved information.
  • a technique for transmitting information at a particular time is also common.
  • This system is intended to process information with an absolute or relative time/point condition like a naviscript, and to present the information only to a user satisfying the condition.
  • FIG. 40 exemplifies the configuration of the system for processing information with a time/point presentation condition.
  • This system comprises: a processing device 1400 having an information retrieving unit 1401 , a target user retrieving unit 1402 , a range condition processing unit 1403 , a relative condition processing unit 1404 , and an information transmitting unit 1405 ; a conditional information database 1410 for storing information with a presentation condition on which a time/point restriction is imposed; a time measuring unit 1411 for measuring time; an estimating module 1412 for estimating a time/point, which is specified relatively in a condition; various sensors 1413 for transmitting various measured values or data; a point information managing system 1415 for identifying the point of users, etc.; and a terminal 1430 .
  • the information retrieving unit 1401 within the processing device 1400 obtains a time from the time measuring unit 1411, and retrieves the information with the condition corresponding to the time.
  • the target user retrieving unit 1402 retrieves a target user satisfying a condition on which a point restriction is imposed based on the point information obtained from the point information managing system 1415 . If a condition includes time/point range (area) specification, the range condition processing unit 1403 controls the presentation of information within the range according to the specification.
  • the relative condition processing unit 1404 retrieves information with a condition including a relatively specified time/point, and processes the condition based on the data from the estimating module 1412 and the sensors 1413 .
  • the information transmitting unit 1405 transmits information only to a user terminal 1430 which satisfies the condition according to the condition of the retrieved information.
  • the terminal 1430 is, for example, a PDA, a PDC, a PHS, a car navigation system, a mobile PC, a wearable computer, a radio, etc., which comprises a presenting unit 1431 for receiving information from the information transmitting unit 1405 within the processing device 1400 , and for presenting the information.
  • This system stores information on which a time/point restriction is imposed as a presentation condition in the conditional information database 1410 , retrieves information satisfying the condition based on the time measured by the time measuring unit 1411 , selects users satisfying the point restriction of the condition based on the point information from the position information managing system 1415 , and presents the information to the terminal 1430 of the selected users.
  • the conditional information database 1410 stores information with a presentation condition on which a time/point restriction is imposed.
  • the conditional information can be simply described and specified by text in a predetermined format.
  • the conditional information can be created by using a GUI in a similar manner as in the above described navigation plan creating and information for guidance managing system.
  • an index may be automatically created by extracting from existing information a keyword regarding a time or a point. Retrieval, management, and usage are facilitated by using a particularly predetermined format like a naviscript.
  • Time specification: presenting to all of the people at a specified time.
  • Point specification: presenting to all of the people belonging to a specified place.
  • Time and point specification: presenting to all of the people belonging to a specified point at a specified time.
  • an absolute or a relative range of a time/point is allowed to be specified.
  • the information stored in the conditional information database 1410 includes, for example, event information (such as a concert, a sports game, fireworks, a department store sale, etc.), restaurant information, sightseeing navigation, route navigation, facility navigation, news, weather forecast, television/radio program schedule, traffic information, horoscope, attention alert, a manual, mail, etc.
  • a time restriction condition may be specified also with a relative time, for example, (3 days after information A:), (1 week after the preceding display:), (3 minutes before the arrival at a point P), etc.
  • a time restriction condition can be specified by using a description such as “till”, “by”, “about”, “in”, “after”, “before”, “since”, “as”, “when”, “while”, “now”, “then”, “once”, “during”, “within”, etc.
  • point range specification such as (a point within a radius of 500 m from a ◯◯ facility:), (on a highway ◯◯:), etc., or range specification such as "within a city", "within a shop", "within a building", "a platform of a station", etc. can be made.
  • Relative specification is made by a function of a matter yet to be determined. At this time, its condition is not determined when the specification is described. The condition is determined when the information is actually delivered. Examples of this type of information include (1 km before the point at which you stay at 12:00:), (within a radius of 300 m from a point at which Mr. ◯◯ stays:), etc.
  • a point restriction condition can be specified for a movement or a relationship between a person and a point, as a derivative of the range specification. Examples include (when approaching Tokyo Station: a timetable of Tokyo Station), (within Chiba City: sightseeing navigation of Chiba Prefecture), (from the Nagoya area: a correspondence table between the Nagoya dialect and standard Japanese), (toward Hokkaido: the weather of Hokkaido), (apart from Tokyo: leisure spot information), etc.
  • a point restriction condition may be specified by using a description such as “at”, “around”, “in”, “to”, “from”, “on”, “near”, “under”, “above”, “up”, “down”, “for”, “toward”, “apart from”, “through”, etc.
  • a condition restricting the number of presentation times or the number of target people can be specified in a time or point condition. For example, specification such as (3 times by May 10:), (with a limitation of 300 persons residing in the district A: (information is transmitted to 3000 persons selected at random from among target people)), etc. is made.
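One way to picture such a presentation condition is sketched below. This is an assumption-laden illustration: the field names, the planar distance check, and the very simple matching rule are hypothetical and are not part of the naviscript specification.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PresentationCondition:
    start: datetime = None         # absolute time window (either bound may be open)
    end: datetime = None
    center: tuple = None           # (x, y) of a point range, e.g. a facility
    radius_m: float = None         # e.g. "within a radius of 500 m"
    max_presentations: int = None  # e.g. "3 times by May 10"

    def matches(self, now, user_pos, presented_so_far):
        if self.start and now < self.start:
            return False
        if self.end and now > self.end:
            return False
        if self.center is not None and self.radius_m is not None:
            dx, dy = user_pos[0] - self.center[0], user_pos[1] - self.center[1]
            if (dx * dx + dy * dy) ** 0.5 > self.radius_m:
                return False
        if self.max_presentations is not None and presented_so_far >= self.max_presentations:
            return False
        return True

cond = PresentationCondition(center=(0.0, 0.0), radius_m=500.0, max_presentations=3)
print(cond.matches(datetime(1999, 9, 8, 12, 0), user_pos=(120.0, 160.0), presented_so_far=1))  # True
```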
  • the processing device 1400 performs the following process.
  • the information/condition retrieving unit 1401 retrieves conditional information corresponding to a time, which is measured at the predetermined time intervals by the time measuring unit 1411 , from the conditional information database 1410 .
  • the predetermined time interval measured by the time measuring unit 1411 is defined to be the shortest possible time interval allowed in the process performed by the information retrieving unit 1401 , or a prescribed time interval.
  • the time measuring unit 1411 notifies a time every 10 minutes, so that all the information with a specified time condition, which corresponds to this time interval (10 minutes), are retrieved.
  • if the retrieval process requires a considerable amount of time, conditions may not be retrieved in real time but may be retrieved beforehand at prescribed time intervals, and the information with the corresponding condition may then be extracted each time the corresponding time is actually reached.
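A sketch of the periodic retrieval, assuming the time measuring unit ticks every 10 minutes and that each item carries an absolute presentation time; the dictionary layout and sample data are assumptions made for the example.

```python
from datetime import datetime, timedelta

def retrieve_for_interval(conditional_info, tick_time, interval=timedelta(minutes=10)):
    """Return all items whose presentation time falls within [tick_time, tick_time + interval)."""
    return [item for item in conditional_info
            if tick_time <= item["time"] < tick_time + interval]

db = [
    {"name": "concert opening", "time": datetime(1999, 9, 8, 18, 5)},
    {"name": "department store sale", "time": datetime(1999, 9, 8, 19, 0)},
]
print(retrieve_for_interval(db, datetime(1999, 9, 8, 18, 0)))  # -> the concert item only
```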
  • the target user retrieving unit 1402 obtains the current point of users from the position information managing system 1415 , and retrieves the users staying at the point satisfying the place restriction condition within the information retrieved by the information/condition retrieving unit 1401 .
  • a system having a self-position identifying system such as a GPS, etc. is made to periodically transmit its own point, or the point information managing system 1415 of a PHS or a cellular phone, etc. can be used.
  • the information transmitting unit 1405 transmits the retrieved information only to the user terminals 1430 retrieved by the target users retrieving unit 1402 .
  • as a method for transmitting information to particular user terminals 1430, a method for specifying the IDs of the receiving terminals 1430 and transmitting the information is generally used.
  • another method for transmitting information to the terminals 1430 within a particular range/area can also be used, for example, by making an output adjustment such that radio waves reach only the particular range/area in a broadcasting manner.
  • the range condition processing unit 1403 performs the process as follows.
  • the range condition processing unit 1403 sets a presentation flag of this information to ON in order to prevent the information from being transmitted to the same user while the presentation flag is ON.
  • the range condition processing unit 1403 sets the presentation flag to OFF.
  • the information is presented only once or a predetermined number of times.
  • if the identical information is presented a plural number of times, it can also be presented periodically at predetermined time intervals while the user stays within the range.
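A minimal sketch of the presentation flag handling described above, under the assumption that the flag is turned ON at the first presentation inside the range and OFF when the user leaves the range; the exact flag policy and the class interface are assumptions.

```python
class RangeConditionProcessor:
    """Present range-conditional information only once per stay within the range."""

    def __init__(self):
        self.flag_on = {}  # (user_id, info_id) -> presentation flag

    def step(self, user_id, info_id, in_range, present):
        key = (user_id, info_id)
        if in_range:
            if not self.flag_on.get(key):
                present(info_id)           # first time inside the range
                self.flag_on[key] = True   # suppress repeats while the flag is ON
        else:
            self.flag_on[key] = False      # assumption: leaving the range resets the flag

proc = RangeConditionProcessor()
for in_range in [True, True, False, True]:  # enters, stays, leaves, re-enters
    proc.step("user1", "info42", in_range, present=lambda i: print("present", i))
```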
  • the capability of the range condition processing unit may be arranged within the terminal 1430 .
  • the range condition processing unit 1403 adjusts the number of presentation times in consideration of other information, so that information is presented a required number of times which suits the amount of information obtained within a unit time period. For example, if the amount of information obtained within a unit time period is 5 items a day and there are already three items of information for the user, another item of information is adjusted to be displayed twice.
  • the technique proposed by the Japanese Patent Application No. 10-270672 “Information Presenting Apparatus for Adjusting and Presenting Information and a Method Thereof” can be used. However, since this does not directly relate to the gist of the present invention, its detailed explanation is omitted here.
  • the range condition processing unit 1403 continues to present the information satisfying the range restriction to the presenting unit 1431 as long as the user stays in the range specified by the information.
  • the most recently received information (information A) is overwritten and presented on a presentation screen 1461 as shown in FIG. 41A.
  • all of candidates of the information satisfying the condition are displayed as a menu 1462 as shown in FIG. 41B.
  • Information items are switched in turn by pressing a button 1463, so that required information is displayed. If all the information cannot be assigned to the displayed menu, as shown in FIG. 41C, the rightmost menu button in the menu 1462 indicates that further information candidates yet to be displayed exist.
  • Information to be presented can be manually selected by pressing the button 1463 also in this case. Additionally, information items to be displayed are allowed to be selected, for example, by displaying the information in an order of recency of the received information items or in descending order of the priorities of the information items.
  • the technique proposed by the Japanese Patent Application No. 10-200237 “Electronic Processing Device Having a Menu Interface” can be used.
  • the information may be presented a prescribed number of times, for example, five times until the end of the running period, on the condition that the amount of information that a user receives has a margin.
  • the information may be presented according to a function which presents the information more frequently as the running period approaches the end.
  • the relative condition processing unit 1404 extracts the information having this condition when the condition including the relatively specified restriction is uniquely determined, and defines the extracted information to be a presentation target. For example, if the time restriction of the presentation condition within information B is “3 days after information A”, whichever day “3 days after” indicates cannot be identified. When the information A is received, the time condition of the information B is determined.
  • the relative condition processing unit 1404 extracts the information B as a presentation target upon receipt of the information A, and presents the information B 3 days after the receipt of the information A.
  • the relative condition processing unit 1404 extracts the information C as a presentation target, and performs the presentation process.
  • if the point restriction of the condition within information D is "within a radius of 300 m from the point at which Mr. ◯◯ stays", the information D is extracted as a presentation target while the point of Mr. ◯◯ can be identified. The process for presenting the information D is performed during that time period.
  • the relative condition processing unit 1404 applies to the condition the value calculated by the estimating module 1412 for estimating the arrival time at the point P or the point to be stayed at 12:00, and performs the presentation process.
  • assume that the presentation condition of information E is "3 minutes before the arrival at the point P" and that the arrival time at the point P is estimated to be 10:00; then the time 9:57 satisfies the time restriction of the condition.
  • the relative condition processing unit 1404 determines that the time restriction of the information E is 9:57.
  • the information/condition retrieving unit 1401 retrieves from the conditional information database 1410 the information E as the information having the time condition corresponding to this time, and presents the retrieved information.
  • the relative condition processing unit 1404 performs the process by applying the point A to the condition, and presents the retrieved information 1 km before the point A, where the point condition is recognized to be satisfied.
  • Such estimation can be made if an action plan is known. Assume that a user driving a car with a car navigation device predetermines a destination, and drives according to a calculated route. In this case, an arrival time can be estimated according to the speed per hour, the jam percentage of a road, etc., and a point to be stayed at a particular time can also be approximately estimated. Additionally, if a user is moving with route information described by a naviscript, it is possible to estimate where the user stays at a specified time even when walking or moving by train (for example, the scheduler 1200 shown in FIG. 27 can be used).
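A sketch of resolving a relatively specified time restriction such as "3 minutes before the arrival at the point P", assuming the estimating module simply returns a predicted arrival time from the remaining distance and speed; the function names and the figures other than 10:00/9:57 are assumptions.

```python
from datetime import datetime, timedelta

def estimated_arrival(point, remaining_km, speed_kmh, now):
    """Very rough stand-in for the estimating module 1412."""
    return now + timedelta(hours=remaining_km / speed_kmh)

def resolve_relative_time(now, point, minutes_before, remaining_km, speed_kmh):
    """Turn '<minutes_before> minutes before the arrival at <point>' into an absolute time."""
    arrival = estimated_arrival(point, remaining_km, speed_kmh, now)
    return arrival - timedelta(minutes=minutes_before)

now = datetime(1999, 9, 8, 9, 30)
trigger = resolve_relative_time(now, "P", minutes_before=3, remaining_km=25, speed_kmh=50)
print(trigger.time())  # 09:57 when the arrival at P is estimated to be 10:00
```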
  • the configuration of this system varies depending on where a time/point restriction of time/point-conditional information is recognized to select information and where the information is presented to a user.
  • An example of the configuration of this system is provided below.
  • FIG. 42 shows a first example of the system configuration in the case where time-conditional information is selected and processed on a server side.
  • the server side comprises the information retrieving unit 1401, the range condition processing unit 1403, the relative condition processing unit 1404, the information transmitting unit 1405, the time measuring unit 1411, the estimating module 1412, and the sensors 1413, in addition to the conditional information database 1410.
  • the server side extracts the information having a corresponding time condition at predetermined time intervals, and transmits the extracted information to all of user terminals 1430 having an information reception capability.
  • the relative condition process and the range condition process are performed on the server side.
  • the terminals 1430 only present information to users.
  • the range condition processing unit 1403 for performing the range condition process may be arranged on the terminal 1430 side.
  • FIG. 43 shows a second example of the system configuration in the case where time-conditional information is processed on a terminal side.
  • a server side comprises only the information transmitting unit 1405 in addition to the conditional information database 1410.
  • the terminal side 1430 comprises the information retrieving unit 1401 , the range condition processing unit 1403 , the relative condition processing unit 1404 , the time measuring unit 1411 , the presenting unit 1431 , and an information buffer 1432 .
  • Time-conditional information transmitted from the server side are received on the terminal 1430 side. The received information are selected based on the time condition, and the selected information is presented to a user.
  • since the range condition process or the relative condition process is performed on the terminal 1430 side in this case, a user can set the processing method for each of the conditions.
  • Setting examples include "if 'till' is used, notification is made 3 days or 1 day before a specified date" as the time specification, "displayed only once when entering a range" as the range specification, "displayed only in a determinate case (the estimating module is not used)" as the relative specification, etc. Note that the amount of information to be presented to a user and its timing may be changed by adjusting the amount of information and by assigning priorities to the time specification and the adjustment of the amount of information.
  • FIG. 44 shows the flow of the process performed when time-conditional information is processed.
  • the relative condition process is performed by the relative condition processing unit 1404 (step S 401 ).
  • a time is obtained by the time measuring unit 1411 (step S 402 ), and the information having the condition corresponding to the obtained time is retrieved by the information retrieving unit Ad ⁇ 1401 (step S 403 ).
  • the range condition process is performed for this information by the range condition processing unit 1403 (step S 404 ). Then, the retrieved information is presented to the user terminal 1430 (step S 405 ).
  • Steps S 401 through S 404 are performed on either of the server and the terminal sides, while step S 405 is performed on the terminal side.
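The flow of FIG. 44 might be sketched as follows. This is only a sketch: the unit interfaces are reduced to plain Python functions, and all of the parameter names and the sample data are assumptions.

```python
def process_time_conditional(info_db, relative_unit, time_unit, retrieve_unit,
                             range_unit, present_unit):
    resolved = relative_unit(info_db)      # step S 401: resolve relative specifications
    now = time_unit()                      # step S 402: obtain the current time
    hits = retrieve_unit(resolved, now)    # step S 403: retrieve items matching the time
    hits = range_unit(hits, now)           # step S 404: apply the range condition process
    for item in hits:                      # step S 405: present to the user terminal
        present_unit(item)

process_time_conditional(
    info_db=[{"name": "sale", "time": "18:00"}],
    relative_unit=lambda db: db,
    time_unit=lambda: "18:00",
    retrieve_unit=lambda db, now: [i for i in db if i["time"] == now],
    range_unit=lambda hits, now: hits,
    present_unit=print,
)
```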
  • FIG. 45 shows a third example of the system configuration in the case where point-conditional information is processed on a server side.
  • the server side comprises the target user retrieving unit 1402, the range condition processing unit 1403, the relative condition processing unit 1404, the information transmitting unit 1405, the estimating module 1412, the sensors 1413, and the point information managing system 1415, in addition to the conditional information database 1410.
  • the terminal 1430 side only comprises the presenting unit 1431 .
  • the server side transmits information directly to a user at a point or within a range corresponding to a condition, and further transmits corresponding information if it receives new information or if a user range changes.
  • point-conditional information corresponding to a user point may be retrieved based on the user point, and the retrieved information may be provided.
  • a user corresponding to each point condition within the information may be retrieved, and the information may be presented to the user.
  • FIG. 46 shows the flow of the process performed when point-conditional information is processed. If a point condition assigned to information includes relative specification, the relative condition process is performed by the relative condition processing unit 1404 (step S 411 ). The point information of a user is obtained from the point information managing system 1415 (step S 412 ), and a user corresponding to the point restriction is retrieved by the target user retrieving unit 1402 (step S 413 ). If the condition of the retrieved information includes point range specification, the range condition process is performed for this information by the range condition processing unit 1403 (step S 414 ). The information is then presented to the retrieved user (step S 415 ).
  • Steps S 411 through S 414 are operations performed on the server side, while step S 415 is performed on the terminal side.
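A sketch of the point-conditional flow of FIG. 46 (steps S 411 to S 415), again with the unit interfaces reduced to plain functions; all of the names and sample data below are assumptions made for illustration.

```python
def process_point_conditional(info_db, relative_unit, point_system, target_unit,
                              range_unit, present_unit):
    resolved = relative_unit(info_db)            # step S 411: resolve relative specifications
    user_points = point_system()                 # step S 412: obtain user point information
    for item in resolved:
        users = target_unit(item, user_points)   # step S 413: users matching the point restriction
        users = range_unit(item, users)          # step S 414: apply the point range process
        for user in users:                       # step S 415: present to each retrieved user
            present_unit(user, item)

process_point_conditional(
    info_db=[{"name": "timetable", "station": "Tokyo"}],
    relative_unit=lambda db: db,
    point_system=lambda: {"user1": "Tokyo", "user2": "Chiba"},
    target_unit=lambda item, pts: [u for u, p in pts.items() if p == item["station"]],
    range_unit=lambda item, users: users,
    present_unit=lambda u, i: print(u, "<-", i["name"]),
)
```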
  • FIG. 47 shows a fourth example of the system configuration in the case where point-conditional information is processed on a terminal side.
  • a server side comprises only the information transmitting unit 1405 in addition to the conditional information database 1410.
  • the terminal side 1430 comprises the information retrieving unit 1401 , the range condition processing unit 1403 , the relative condition processing unit 1404 , the presenting unit 1431 , the information buffer 1432 for storing a process result of a relative condition, and a self-point identifying unit 1433 as a replacement of the point information managing system 1415 .
  • the information having the place condition suitable for a user point can be presented when the terminal 1430 side comprises the self-point identifying unit 1433 (such as a GPS device, and the like). Additionally, the user side can specify a point restriction range. By way of example, "receiving only information within a radius of 1 km from a transmitting source of point-conditional information (the center of a specified range within the information)", "presenting information only when a corresponding point exists on a route during a move, regardless of a specified point range", etc. are specified.
  • the process of the terminal 1430 in this configuration example is similar to that explained by referring to FIG. 46.
  • the information having a time/point condition can be presented by the presenting unit 1431 when the terminal 1430 side comprises the self-point identifying unit 1433 (such as a GPS device, and the like).
  • FIG. 48 shows the flow of the process performed when a server side processes time/point-conditional information.
  • if the condition assigned to the information includes relative specification, the relative condition process is performed by the relative condition processing unit 1404 (step S 421).
  • a time is obtained by the time measuring unit 1411 (step S 422 ), and the information having the condition corresponding to the obtained time is retrieved by the information retrieving unit 1401 (sep S 423 ).
  • the point information of users is then obtained from the point information managing system 1415 (step S 424 ), and a user satisfying the point restriction is retrieved by the target user retrieving unit 1402 (step S 425 ).
  • the range condition process is performed by the range condition processing unit 1403 (step S 426 ).
  • the information is transmitted to the terminal 1430 of the selected user, and the selected information is presented by the presenting unit 1431 (step S 427 ).
  • FIG. 49 shows the flow of the process performed when a terminal side selects time/point-conditional information.
  • steps S 431 through S 435 are operations performed on the terminal 1430 side.
  • the relative condition process is performed by the relative condition processing unit 1404 (step S 431 ).
  • the time and the self-point are respectively obtained by the time measuring unit 1411 and the self-point identifying unit 1433 (step S 432), and the information having the condition corresponding to the obtained time/point is selected by the information retrieving unit 1401 (step S 433).
  • the range condition process is performed for this information by the range condition processing unit 1403 (step S 434 ).
  • the selected information is then presented by the presenting unit 1431 (step S 435 ).
  • FIG. 50 shows a sixth example of the system configuration where time/point-conditional information is processed by a terminal having a self-schedule managing capability.
  • the terminal 1430 side comprises the information retrieving unit 1401 , the range condition processing unit 1403 , the relative condition processing unit 1404 , the time measuring unit 1411 , and the self-point identifying unit 1433 , as units for identifying a time/point condition.
  • the terminal 1430 side further comprises an inputting unit 1434 for inputting, as time/point-conditional information, also the schedule information of a local user or of the group to which the user belongs.
  • FIG. 51 shows the flow of the process performed when time/point-conditional information is processed by the terminal having the self-schedule managing capability.
  • the time/point-conditional information transmitted from a server side is received (step S 441).
  • Schedule information is input from the inputting unit 1434 (step S 442 ).
  • if the condition assigned to the information includes relative specification, the relative condition process is performed by the relative condition processing unit 1404 (step S 443), and a time and a self-point are respectively obtained by the time measuring unit 1411 and the self-point identifying unit 1433 (step S 444).
  • the information having the condition corresponding to the obtained time/point is retrieved by the information retrieving unit 1401 (step S 445 ).
  • the range condition process is performed by the range condition processing unit 1403 for this information (step S 446 ).
  • the retrieved information is then presented by the presenting unit 1431 (step S 447 ).
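A sketch of how locally input schedule entries might be merged with received conditional information before retrieval (steps S 442 and S 445). The data layout, the exact-time matching rule, and the sample entries are assumptions.

```python
def merge_and_retrieve(received, schedule, now):
    """Treat locally input schedule entries as additional time-conditional information
    (step S 442), then retrieve everything whose time matches the current time (step S 445)."""
    pool = list(received) + [
        {"text": f"Schedule: {entry['title']}", "time": entry["time"]}
        for entry in schedule
    ]
    return [item for item in pool if item["time"] == now]

received = [{"text": "Fireworks tonight", "time": "19:00"}]
schedule = [{"title": "meeting with the group", "time": "10:00"}]
print(merge_and_retrieve(received, schedule, now="10:00"))  # -> the schedule entry
```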
  • the contents of this naviscript mean that the text data "This is built in the year ◯◯, and famous for XX . . .", the voice data within the file "aaa.wav", and the image data within the file "bbb.jpg" are presented to the user staying within a range of the radius of 1.0 km from the point at the latitude of N35.11.11.111 and the longitude of E135.22.22.222.
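Such a point-conditional entry could be pictured as follows. The tag spelling inside the string, the simplification of the coordinates to decimal degrees, and the great-circle distance check are all assumptions made for the sketch; only the file names and the 1.0 km radius come from the example above.

```python
import math

# Hypothetical rendering of the entry described above; tag names are illustrative only.
naviscript_entry = """
<point>
  <latitude>35.1111111</latitude><longitude>135.2222222</longitude>
  <text>This is built in the year (omitted), and famous for XX ...</text>
  <voice src="aaa.wav"/><image src="bbb.jpg"/>
</point>
"""

def within_radius(user_lat, user_lon, lat, lon, radius_km=1.0):
    """Rough great-circle distance check for the 1.0 km presentation radius."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(user_lat), math.radians(lat)
    dlat, dlon = p2 - p1, math.radians(lon - user_lon)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= radius_km

print(within_radius(35.1150, 135.2230, 35.1111111, 135.2222222))  # True: well inside 1 km
```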
  • Information in the above described explanation means information contents which are significant to the user. However, information is not limited to the above described type of information. Information which is not significant to a user can be processed in a similar manner as a signal passing through a machine.
  • example example-04 — 05
  • time-optimal—items relating to ⁇ point> included in the tag set of ⁇ seq> are rearranged so as to minimize a required time, and the rearranged items are sequentially executed.
  • distance-optimal—items relating to ⁇ point> included in the tag set of ⁇ seq> are rearranged so as to minimize a required distance, and the rearranged items are sequentially executed.
  • cost-optimal—attributes relating to ⁇ route> included in the tag set of ⁇ seq> are determined so as to minimize a required cost, and the result is sequentially executed.
  • This condition means “if the contents of the tag set of ⁇ time> within the tag set to which the ID of Daiba IC is assigned indicate 11:30 or later, and 13:30 or earlier”.
  • the symbols and their meanings used within the attribute “if” are as follows.
  • the left side is equal to the right side.
  • the left side is not equal to the right side.
  • the left side is less than the right side (<: less than).
  • [0600] content: the following tag sets, or an arbitrary combination of an arbitrary number of them, can be included.
  • time indicates a time at which navigation is performed.
  • content: a time at which navigation is performed.
  • a time can be specified both absolutely and relatively as follows.
  • Relative time specification: "10 minutes before a succeeding instruction".
  • point indicates a point at which navigation is performed. “point” absolutely stipulates a point.
  • location indicates a position at which navigation is performed. “location” relatively stipulates a position.
  • object indicates an object to be navigated such as a building.
  • content the following tag sets can be included. ⁇ name>, ⁇ category>, ⁇ address>, ⁇ zip-code>, ⁇ country>, ⁇ phone>, ⁇ fax>, ⁇ url>, ⁇ e-mail>, ⁇ latitude>, ⁇ longitude>, ⁇ altitude>, ⁇ open>, ⁇ close>, ⁇ reservation>, ⁇ comment>, ⁇ text>, ⁇ voice>, ⁇ audio>, ⁇ image>, ⁇ video>
  • these tag sets can be recognized as the elements for stipulating an object such as a facility to be navigated.
  • example: restaurant, Italian,
  • duration: a time period during which "text" is displayed.
  • duration: a time period during which "voice" is output.
  • times: the number of times that "voice" is output.
  • example: Specialty is . . . made by an Italian chef.
  • src: an "audio" file output as one form of object navigation is specified.
  • duration: a time period during which "audio" is output.
  • src: an "image" file displayed as one form of object navigation is specified.
  • duration: a time period during which an "image" is displayed.
  • src: a "video" file played as one form of object navigation is specified.
  • duration: a time period during which a "video" is played.
  • these tag sets can be recognized as the elements for stipulating a route to be navigated. Alternatively, the following content can be written.
  • example: walk, bicycle, car, bus, train, ship, plane, . . .
  • example: a normal road, a toll road, a highway, an esplanade, . . .
  • ⁇ par> indicates “parallel”. ⁇ par> indicates that included items are executed in parallel. ⁇ par> is defined to be a default setting in a portion below ⁇ info>, and thus ⁇ par> can be omitted.
  • duration: a time period during which "text" is displayed.
  • duration: a time period during which "voice" is output.
  • src: an "audio" file output as one form of navigation is specified.
  • duration: a time period during which "audio" is output.
  • duration: a time period during which "image" is displayed.
  • src: a "video" file reproduced as one form of navigation is specified.
  • duration: a time period during which "video" is reproduced.
  • a naviscript is written by using a sequence of instructions which include as a constituent element a presentation time, or both of a presentation time and information for guidance to be output at that time.
  • a naviscript is written by using a sequence of instructions which include as a constituent element a point to be reached, or both of a point to be reached and information for guidance to be output at that point.
  • a naviscript is written by using a sequence of instructions which include as a constituent element a presentation time or both of a presentation time and information for guidance to be output at that time, and/or a point to be reached or both of a point to be reached and information for guidance to be output at that place.
  • a naviscript can describe that a plurality of instructions are processed sequentially or in parallel, and that navigation is made in an optimum order of required durations, distances, or costs of a plurality of instructions, or an order specified by a compound combination of the instructions in the above described methods (01), (02), and (03).
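A sketch of what an optimum-order rearrangement such as "distance-optimal" could look like, assuming the points are reordered by brute force over all orderings (the actual optimization method is not specified here, and the point names and coordinates other than Daiba IC are hypothetical).

```python
from itertools import permutations
import math

def path_length(order, coords):
    return sum(math.dist(coords[a], coords[b]) for a, b in zip(order, order[1:]))

def distance_optimal(points, coords):
    """Rearrange the point items so that the total travelled distance is minimized
    (brute force; fine for the handful of points a single sequence usually contains)."""
    return min(permutations(points), key=lambda order: path_length(order, coords))

coords = {"Daiba IC": (0, 0), "Aquarium": (5, 1), "Restaurant": (2, 0)}
print(distance_optimal(list(coords), coords))  # -> ('Daiba IC', 'Restaurant', 'Aquarium')
```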
  • a naviscript can specify respective times by using a time range obtained by an arbitrary combination of an absolute time like "10:00", a relative time like "10 minutes after", "a time at or before . . . ", "a time at or after . . . ", "a time before . . . ", and "a time after . . . ", in the above described methods (01), (03), and (04).
  • a naviscript can specify respective points by using a point range obtained by an arbitrary combination of an absolute point (for example, coordinates such as latitude, longitude, and altitude, or a proper attribute of an object which can indirectly identify a point, such as a name, an address, a telephone number, etc.), a relative point (such as "10 km beyond . . . "), a point range (such as "within a radius of 10 km"), a point range specified by an attribute of an abstract concept which can indirectly identify a point (such as a name, an address, a zip code, etc.), or "inside . . . ", "outside . . . ", "within . . . " and "beyond . . . ", in the above described methods (02), (03), and (04).
  • a naviscript can specify a route or a track, which is a point transition with time, by using an arithmetic function, a separately defined function, or separately specified data, or an arbitrary combination of the functions and data in the above described methods (02), (03), and (04).
  • a naviscript can specify a condition of whether or not to execute each instruction by describing whether or not a navigation provider/providing apparatus, a navigation user/using apparatus, information about navigation contents, information about a move method, and information about peripheral situation, or their combination is equal to a certain value or belongs to a certain range (set) in the above described methods (01), (02), (03), and (04).
  • a naviscript can describe also a variety of external information such as a facility, an object, an event (such as a concert, an exhibition, etc.), a timetable, etc., which relate to a presentation time, a presentation time and navigation information to be output at that time, a point to be reached, and a point to be reached and information for guidance to be output at that point, by specifying their locations with network addresses, etc. in the above described methods (01), (02), (03), and (04).
  • a naviscript can specify as a navigation outputting means characters, a map, voice, music, an image, a video, light, smell, force, and a movement, or their arbitrary combination in the above described methods (01), (02), (03), and (04).
  • a naviscript can describe the items relating to the summary of navigation in the above described methods (01), (02), (03), and (04).
  • Respective types of move data such as times, places, etc. of a user or a move method, and/or respective types of media data such as voice, music, an image, a video, etc. are sampled, so that a part or the whole of the above described naviscript is semi-automatically generated in a discrete or a continuous manner.
  • a portion or the whole of the naviscript is stored by being assigned a unique number or name which can identify data.
  • a desired naviscript is retrieved with a given keyword using contents, which are obtained by excluding tags included in a portion or the whole of the naviscript, as a target.
  • a desired naviscript is retrieved with a given keyword using only contents, which relate to a particular tag (set) included in a portion or the whole of the naviscript, as a target.
  • a desired naviscript is retrieved by using as keys one or a plurality of tags and the contents relating to the tags in a portion or the whole of the naviscript.
  • Tags of items relating to the summary of the navigation in a portion or the whole of the naviscript are used as the tags referred to in the above described method (3).
  • the naviscript creating system creates a portion or the whole of the naviscript by dividing one or a plurality of naviscripts into respective portions in units of instruction, and/or by merging the portions.
  • the naviscript creating system assists in attaching tags in a portion or the whole of the naviscript based on the specification of the naviscript language by automatically complementing the name of a tag when characters at the beginning of the tag or its abbreviation are input, or by automatically attaching a tag with a selection from a menu.
  • the naviscript creating system arranges a portion or the whole of the naviscript in a hierarchical structure based on the specification of the naviscript language, and displays the naviscript.
  • the naviscript creating system has a parsing capability and a debugging capability, which are intended to check a portion or the whole of the naviscript, to indicate a grammatically erroneous portion to a user, and to automatically modify the erroneous portion based on the specification of the naviscript language.
  • the naviscript creating system has an inputting unit for inputting from a map information system to a portion or the whole of the naviscript the information about a place including latitude, longitude, altitude, object attributes such as a name, an address, a telephone number, etc., and/or navigation information accompanying the place via a buffer or a file, and an outputting unit for outputting the portion or the whole of the naviscript via a buffer or a file.
  • the naviscript creating system has a loading/saving unit for loading/saving a portion or the whole of the naviscript to/in a local file system and/or a network file system.
  • the naviscript creating system has a storing unit for storing a portion or the whole of the naviscript by using the above described storing methods for the naviscript.
  • Respective types of parameters in a portion or the whole of the naviscript which are required by a naviscript resultant from conversion, are specified by a description within the naviscript, a default value, another specification file, a user menu selection, or their combination.
  • the naviscript is converted into a form of an article in a travel advertisement or an informative magazine.
  • the naviscript is converted into a form of a program or a commercial of television or radio.
  • the naviscript converting system comprises a converting unit for converting a portion or the whole of the naviscript with the above described conversion methods.
  • the navigation instruction executing system executes an instruction extracted from a naviscript with the process algorithm shown in FIGS. 7 and 8 or with the process algorithm a portion of which is omitted.
  • Navigation is performed for a current point, a departure point, en-route spots, a destination, a route, etc., one after another, or according to each instruction in a portion or the whole of the naviscript. Or, navigation is performed for a certain time, distance, or point, or with an input operation or according to an external event.
  • a combination or a switching selection in the above described methods (04) and (05) is specified by a description in a portion or the whole of the naviscript, a default value, another specification file, or a user menu selection, or their combination.
  • the naviscript providing system (a server or a center) provides to a terminal via a network or an electronic medium a naviscript which describes an information sequence of various points and a route such as a recommended date spot, sightseeing course, etc., and navigation information accompanying the information sequence (such as facility information, a right-turn or left-turn directive, etc.), and makes the terminal perform navigation according to the naviscript.
  • the naviscript using system obtains a naviscript being an information sequence describing various spots or a route such as a recommended date spot, sight-seeing course, etc. and navigation information accompanying the information sequence (such as facility information, a right-turn or left-turn directive, etc.) via a network or an electronic medium, and performs navigation according to the obtained naviscript.
  • the naviscript providing system uses a portion or the whole of the naviscript as the naviscript used in the above described (01) or (02).
  • the naviscript providing system and using system are configured by using the above described naviscript conversion methods, the above described naviscript converting system (translator), the above described navigation instruction execution method, the above described navigation instruction executing system (processor), the navigation output method based on the naviscript, and the above described navigation outputting system (browser).
  • the naviscript providing system and using system have a navigation mode and/or a simulation mode as navigation modes.
  • in the navigation mode, the system obtains the information about an actual current time/point from a state acquiring unit, and performs actual navigation according to the obtained information.
  • in the simulation mode, the system obtains the information about a virtual current time/point, and performs virtual navigation according to the obtained information.
  • the naviscript providing system and using system have a state generating unit for initializing and stopping a virtual current time or moving a virtual current time forward or backward, and for allowing the forward or the backward moving speed to be changed during simulation in the simulation mode.
  • the naviscript providing system and using system change the contents of navigation, an output method, and/or a display method according to the type of a using device.
  • the naviscript providing system and using system can be used for route navigation, sightseeing navigation and guidance, a delivery plan, a travel plan, traffic control, scheduling, an amusement, a municipal service, etc.
  • the systems can be used for driving management such that a driving management center returns a reserved and modified naviscript by receiving a naviscript describing an itinerary/route that a user desires from an information device such as a car navigation system, and by making a comparison/coordination between the naviscript and the data stored in a driving management database.
  • the systems automatically determine whether or not a user can reach a destination by an arrival time, and propose an action to be taken by the user, so that a time adjustment during a move can be made when navigation information using a naviscript is presented.
  • the systems attach navigation information to areas on a map, automatically capture the navigation information about a route when a user selects the route on the map, and can easily create a navigation plan with a naviscript obtained by combining navigation information.
  • An information providing system can be configured so that also a range or a relatively specified time/point condition can be processed for the information with the time/point presentation condition, which is described by a naviscript, and suitable information is transmitted to a user or users corresponding to the time/point condition.
  • a navigation service can be provided/used to/by various users offline or online with various types of devices/media at the same time or at different times, at the same point or at different points.
  • a naviscript can be executed, converted, created, edited, divided, merged, changed, modified, copied, deleted, stored, and retrieved. It can be also formed into a database for reuse. Additionally, the naviscript can be carried or transferred by a suitable electronic medium or a network. Furthermore, it can be sold, purchased, issued, received, given, taken, thrown away, picked up, value-added (such as a mileage service), etc.
  • the naviscript can be created and provided by anybody such as a naviscript center, a contents provider, each facility, an individual, a group, etc.
  • the naviscript can be used by various devices/media such as a PC, a car navigation system, PDA, PDC, PHS, an IC card, a prepaid card, a magnetic disk, an optical disk, a bar code, paper, etc.
  • a PC, etc. can be used as the naviscript providing device.
  • a PC, a car navigation system, PDA, PDC, PHS, and the like can be used as the naviscript using device.
  • a naviscript created by a PC, a car navigation system, PDA, PDC, PHS, etc. can be written to an IC card or a prepaid card. Additionally, the naviscript written to the IC card or the prepaid card is read into the car navigation system, the PDA, the PDC, the PHS, etc., so that the instructions for a navigation service can be executed.
  • a copy of the naviscript can be transmitted to the corresponding car navigation system at the time of riding in a cab. The same applies at the time of a transfer to another cab. Besides, a narrative service based on navigation information can be received.
  • a naviscript being received by a certain PC, car navigation system, PDA, PDC, PHS, etc. can be received also by another PC, car navigation system, PDA, PDC, PHS, etc. Therefore, for example, a plurality of private cars can make the same tour. Namely, one naviscript can be shared by many people.
  • the naviscript can be carried or traded by means of an IC card or a prepaid card.
  • if an IC card storing a naviscript and electronic money is used, a ticket can be purchased by inserting the IC card into an automatic ticket vending machine, and a seat and also a hotel can be reserved at a ticket center. Since these pieces of information are added to the naviscript, navigation is performed not only from the station to the seat, but check-in at the hotel can also be made easily.
  • if the naviscript is read into a terminal in a room, it becomes possible to make the terminal care for you. For example, "Is it OK to wake you up at ◯◯ o'clock tomorrow?", etc. is uttered according to the schedule.
  • Prepaid cards describing various naviscripts, such as a historic spots tour, a route on which a famous person passed, a tour of impressive movie scenes, etc., can be sold. Shop advertisements or a movie schedule can also be included therein. Furthermore, restaurant information, etc. can be included, which facilitates making a reservation at a restaurant. Note that the cards may include only the principal part of a naviscript, and quoted voice or image data may be stored at a source in a network.
  • tissue paper including a naviscript prepaid card is distributed in front of a station, or a name card is formed as a naviscript prepaid card, so that the advertised shop or company can be easily reached.
  • a course introduced by television can be downloaded from the Internet as a naviscript, or a demo tape of a course carried in a magazine or a guide book can be viewed with an input from an attached bar code or CD-ROM.
  • since a naviscript can be described as text data, it can be written to/read from paper, plastic, etc.
  • a paper sheet on which a naviscript describing the procedures for reaching a treasure spot is written can be stored in a particular place like a bank, similarly to cash.
  • a naviscript can be applied in various ways. As navigation within a building, a natural world, or a virtual world, for example, indoor navigation using an elevator or an escalator, navigation in a skiing area or a golf course, navigation for going down a stream or for scuba diving, an experience of a simulated sightseeing flight or space trip, navigation in a virtual shopping mall, etc. can be performed.
  • a naviscript allows navigation in the past or in the future.
  • the navigation can be applied to, for example, navigation for Tokaido, an explanation about an invasion, a battle, or a war, an explanation about scenery viewed from the Silk Road or a car/train window in the world, a flashback of a movie such as the Titanic, creation of a logbook or a travel album, etc.
  • navigation with various types of data can be experienced by various types of devices, systems, and media in various places. Additionally, navigation can be virtually received by setting a virtual time or place. Note that the present invention is not limited to navigation for a route such as a road, etc. The present invention can be also applied to navigation in a virtual space/time world, visualization of a travel course of an animal, transportation facilities, a weather satellite, etc., a display of barter, a display of a transmission path of mail, etc.

Abstract

A navigation script includes time and point information for navigation and information for guidance, and describes an instruction sequence which can represent these information in time series in a mark-up language. According to the structured data generated from the navigation script, an instruction corresponding to a current time or point is executed, so that information for guidance to be presented is output.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a navigation information presenting apparatus for providing navigation information of a route, etc. by using a markup language description and a method thereof, and more particularly to a technique which is applied to car navigation systems, personal computers, PDA (Personal Digital Assistant), PDC (Personal Digital Cellular), PHS (Personal Handyphone System), etc., and is available for providing route information or additional information such as route navigation, sightseeing information, a delivery plan, a travel plan, traffic control, scheduling, an amusement, a municipal service, etc. via a network or an electronic medium. [0002]
  • 2. Description of the Related Art [0003]
  • With conventional navigation information services, for example, if a point desired to be navigated is specified, the data about that point are listed and presented in many cases. Additionally, their contents include only the geographic information (such as a point, a route to the point, or some facilities, etc.) [0004]
  • Furthermore, with a conventional car navigation system, when a departure point, a destination, en-route spots, etc. are set, an appropriate route is selected based on map information, and only geographic information is output while a car is running; thus, it is impossible to output additional information for navigation and guidance at a particular time and/or point. Still further, the route information from a departure point to a destination, which is set by one car navigation system, cannot be ported to another so as to utilize the information. [0005]
  • Besides, in a conventional navigation information service, a data list relating to a place to be presented is not easy to understand. For example, it is unclear whether such a data list is a mere data set or a data sequence described in the order to be executed. Accordingly, it is also unclear how to handle the described data. [0006]
  • Additionally, since the contents of described data include only the information about a specified point, it is impossible to flexibly guide the route reaching that point or en-route spots. By way of example, information for guidance such as "This facility is famous for ◯◯" or "Reaches ◯◯ in 3 more minutes" 3 minutes prior to an arrival at a destination cannot be described and provided on a route with only destination specification. [0007]
  • Furthermore, because data of the conventional navigation information service are provided in a format which differs depending on each system, navigation information is difficult to use reciprocally. For instance, although point information services can be provided in a car navigation system or a PHS (Personal Handyphone System), their data cannot be shared by the systems. [0008]
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide an apparatus which can provide the information such as points, routes, and facilities, can actually or virtually perform navigation along a route to a certain point as a user moves or time elapses, and can provide various information for guidance in a format which can be shared by various systems or devices, and a method thereof. [0009]
  • A navigation information presenting apparatus in a first aspect of the present invention comprises an inputting unit, a state acquiring and/or generating unit, a processing unit, and a presenting unit, and presents navigation information to a user or users depending on the state. [0010]
  • The inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and various information for guidance to be output according to at least a time and/or point to be presented, where each of the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof. [0011]
  • The state acquiring unit acquires a state including at least a current time and point, and the state generating unit generates a state including at least either of a virtual current time and point. The processing unit processes the instructions described in an input navigation script according to at least the current or virtual time and point obtained by the state acquisition process or the state generation process. The presenting unit outputs navigation information to be output as the instructions are processed, and presents the navigation information to a user or users. [0012]
  • A navigation information presenting apparatus in a second aspect of the present invention comprises a selecting unit and an outputting unit, and presents navigation information to a user or users depending on a state. [0013]
  • The selecting unit dynamically selects navigation information to be presented according to at least time information and/or point information. The outputting unit outputs selected navigation information according to at least a time and a point to be presented. [0014]
  • A driving managing device in a third aspect of the present invention comprises an inputting unit, a driving management database, a coordinating unit, and an outputting unit, and manages driving data. [0015]
  • The inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and information for guidance to be output according to at least a time and/or point to be presented, where the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof. [0016]
  • The driving management database manages the data describing at least time and/or point information, and at least a reservation state and/or a point state. The coordinating unit makes a comparison and coordination between the navigation script input by the inputting unit and the data of the driving management database, and performs the process for modifying the navigation script and the process for updating the data of the driving management database depending on need, according to the result of the comparison and coordination. The outputting unit outputs a resultant navigation script. [0017]
  • A time coordinating device in a fourth aspect of the present invention comprises an inputting unit, a scheduler, a rule base, and a monitoring and executing device, and proposes an action to be executed by a user depending on whether or not the user reaches by an arrival time. [0018]
  • The inputting unit inputs a navigation script composed of an instruction sequence based on a predetermined specification, which can describe at least time information and/or point information, and information for guidance to be output according to at least a time and/or point to be presented, where the information is described by a set of combinations of a name which can identify a type of the information and the contents thereof. [0019]
  • The scheduler schedules arrival times at respective points. The rule base stores the rules which describe actions to be executed depending on whether or not there is sufficient time before an arrival time. The monitoring/executing device checks the arrival times at which subsequent points from the current point at the current time are reached, for at least each of a predetermined time, point, and distance. [0020]
  • A navigation plan creating apparatus in a fifth aspect of the present invention comprises an associating unit, a setting unit, and a creating unit, and creates a navigation plan obtained by combining navigation information. [0021]
  • The associating unit associates the navigation information with one of areas and points of map information. The setting unit sets a route specified on the map information. The creating unit creates a navigation plan by extracting the navigation information associated with the set route. [0022]
  • A navigation information providing apparatus in a sixth aspect of the present invention comprises a managing unit, a retrieving unit, and a providing unit, and provides information to a user. [0023]
  • The managing unit manages the information with a presentation condition relating to a time. The retrieving unit checks the information with the presentation condition for each time step, and retrieves the information which satisfies a time condition. The providing unit provides a user with the information which satisfies the time condition. [0024]
  • A navigation information providing apparatus in a seventh aspect of the present invention comprises a managing unit, an obtaining unit, a retrieving unit, and a providing unit, and provides a user with information. [0025]
  • The managing unit manages the information with a presentation condition relating to a place. The obtaining unit obtains the position information of a user. The retrieving unit checks the information with the presentation condition according to the obtained position of the user, and retrieves information which satisfies a place condition. The providing unit provides the user with the information which satisfies the place condition. [0026]
  • A navigation information providing apparatus in an eighth aspect of the present invention comprises a managing unit, an obtaining unit, a retrieving unit, and a providing unit, and provides a user with information. [0027]
  • The managing unit manages the information with a presentation condition relating to a place. The obtaining unit obtains the position information of a user. The retrieving unit checks the information with the presentation condition, and retrieves a user who satisfies a place condition. The providing unit provides the retrieved user with the information with the corresponding presentation condition. [0028]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the configuration of an apparatus according to the present invention; [0029]
  • FIG. 2 explains the process performed by a script editing unit; [0030]
  • FIG. 3 shows a part of structured navigation data into which a navigation script is converted, in the form of a table; [0031]
  • FIG. 4 shows, in the form of a table, a part of structured navigation data into which a naviscript is converted; [0032]
  • FIG. 5 is a flowchart showing the process performed by an operation inputting unit; [0033]
  • FIG. 6 is a flowchart showing the process performed by a script converting unit; [0034]
  • FIG. 7 is a flowchart showing the preparing process performed by an instruction processing unit; [0035]
  • FIG. 8 is a flowchart showing the execution process performed by the instruction processing unit; [0036]
  • FIG. 9 is a flowchart showing the state acquiring process performed by a state acquiring unit; [0037]
  • FIG. 10 is a flowchart showing the information acquiring process performed by the state acquiring unit; [0038]
  • FIG. 11 is a flowchart showing the state preparing process performed by a state generating unit; [0039]
  • FIG. 12 is a flowchart showing the state generating process performed by the state generating unit; [0040]
  • FIG. 13 is a flowchart showing the navigation outputting process performed by a navigation outputting unit; [0041]
  • FIG. 14A shows a target route of a script semi-automatic generation process; [0042]
  • FIG. 14B shows target time series data of the script semi-automatic generation process; [0043]
  • FIG. 15 exemplifies the system configuration in the case where the present invention is applied to a portable personal computer; [0044]
  • FIG. 16 exemplifies a menu screen for retrieving a naviscript; [0045]
  • FIG. 17 exemplifies a screen resultant from the naviscript retrieval; [0046]
  • FIG. 18 exemplifies a screen on which navigation and operations are performed based on a naviscript; [0047]
  • FIG. 19 exemplifies the system configuration in the case where the present invention is applied to a car navigation system; [0048]
  • FIG. 20 exemplifies the system configuration in the case where the present invention is applied to a PHS; [0049]
  • FIG. 21 exemplifies the system configuration in the case where the present invention is applied to a driving managing system; [0050]
  • FIG. 22 exemplifies a naviscript editor screen displayed by a terminal; [0051]
  • FIG. 23 exemplifies a naviscript browser screen displayed by the terminal; [0052]
  • FIG. 24 is a flowchart showing the process performed by the terminal; [0053]
  • FIG. 25 is a flowchart showing the process performed by a driving managing center; [0054]
  • FIG. 26 is a flowchart showing the comparison/coordination process performed by the driving managing center; [0055]
  • FIG. 27 exemplifies the system configuration in the case where the present invention is applied to a time coordinating system during a move; [0056]
  • FIG. 28 exemplifies a monitor display screen; [0057]
  • FIG. 29 is a flowchart showing the process performed by a scheduler; [0058]
  • FIG. 30 is a flowchart showing the process performed by a monitor; [0059]
  • FIG. 31 exemplifies the configuration of a navigation plan creating and information for guidance managing system; [0060]
  • FIG. 32 is a flowchart showing the process for creating a navigation plan; [0061]
  • FIG. 33A shows the process for attaching navigation information to map data; [0062]
  • FIG. 33B shows the process for attaching navigation information to time data; [0063]
  • FIG. 34 exemplifies a navigation information setting screen; [0064]
  • FIG. 35 exemplifies the process for attaching navigation information; [0065]
  • FIG. 36A shows a first example of navigation sheets; [0066]
  • FIG. 36B shows a second example of a navigation sheet; [0067]
  • FIG. 37 exemplifies navigation information display; [0068]
  • FIG. 38A shows a first example of a route; [0069]
  • FIG. 38B shows a second example of the route; [0070]
  • FIG. 38C shows a first expanded route; [0071]
  • FIG. 38D shows a second expanded route; [0072]
  • FIG. 38E shows a point database and a time database; [0073]
  • FIG. 39 exemplifies map data for route navigation; [0074]
  • FIG. 40 exemplifies the configuration of a system for processing information with a time/point condition; [0075]
  • FIG. 41A shows a first example of an information presentation screen; [0076]
  • FIG. 41B shows a second example of the information presentation screen; [0077]
  • FIG. 41C shows a third example of the information presentation screen; [0078]
  • FIG. 42 exemplifies the system configuration in the case where a server side processes information with a time condition; [0079]
  • FIG. 43 exemplifies the system configuration in the case where a terminal side processes information with a time condition; [0080]
  • FIG. 44 is a flowchart showing the process for manipulating information with a time condition; [0081]
  • FIG. 45 exemplifies the system configuration in the case where a server side processes information with a place condition; [0082]
  • FIG. 46 is a flowchart showing the process for manipulating the information with the place condition; [0083]
  • FIG. 47 exemplifies the system configuration in the case where a terminal side processes information with a place condition; [0084]
  • FIG. 48 is a flowchart showing the process for manipulating information with a time/point condition; [0085]
  • FIG. 49 is a flowchart showing the process for manipulating information with a time/point condition performed on the terminal side; [0086]
  • FIG. 50 exemplifies the system configuration in the case where a terminal having a scheduling capability processes information with a condition; and [0087]
  • FIG. 51 is a flowchart showing the process for manipulating information with a time/point condition performed on the terminal side having the scheduling capability. [0088]
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Provided below is a detailed explanation of preferred embodiments according to the present invention, with reference to the drawings. [0089]
  • First of all, examples in which the present invention is applied are provided for ease of understanding of the present invention. [0090]
  • Assume that a user will visit a friend's home on the next Sunday, and he or she has received text information which is referred to as a navigation script in the present invention and created by the friend, and image data, etc. accompanying the navigation script depending on need. On the next Sunday, the user makes his or her portable PC or electronic pocketbook read the received navigation script, carries the PC or the pocketbook, and makes it perform navigation using the navigation script at the station nearest to the friend's home, etc. In this way, the route from the nearest station to the friend's home is displayed on the screen of the portable PC or the electronic pocketbook. At this time, the current position continuously changes and is displayed on the route as the user moves. At a point where the user seems to get lost on the route, voice or image warning is issued by an instruction which is described in the navigation script beforehand. This navigation script can be used also for a car navigation system having the capabilities of the present invention. Therefore, the navigation according to the navigation script created by the friend can be received, also when the user visits the friend's home by car. [0091]
  • Also, the following usage can be made. Here, assume that a user walks around Shibuya Station for a couple of hours. In this case, a navigation script for navigating Shibuya Station for a couple of hours is downloaded from the center providing navigation information via a network. Instructions described in the navigation script are executed by a portable information device, so that a navigation service according to a time and a place can be received. Also restaurant information is automatically displayed around lunch time. The navigation service can be also received by a cellular phone, etc. In this case, the instructions included in the navigation script are executed on a center side, and a center device transmits the navigation information to the cellular phone as voice or text. [0092]
  • Additionally, navigation scripts of recommended sightseeing courses are created and registered to an electronic medium such as a CD-ROM (Compact Disk-Read Only Memory) being attached to a travel magazine as a supplement, a bar-code, etc. A subscriber retrieves his or her desired sightseeing course from the electronic medium with a PC, etc., and executes the instructions included in the retrieved navigation script in a simulation mode. In this case, the navigation information are dynamically displayed as if the subscriber actually walked along the sightseeing course. Furthermore, by executing the navigation script in a navigation mode on the actual course, the information for guidance according to the point where the subscriber actually stays can be viewed. [0093]
  • To realize the above described capabilities, a navigation information presenting apparatus according to the present invention comprises: an inputting unit for inputting a navigation script describing an instruction sequence which can describe at least time and/or point information, and information for guidance to be output according to a time and/or point to be presented, based on a predetermined specification; a unit for acquiring the state of a current time and point, or for generating the state of a virtual current time and point; a unit for processing instructions described in the input navigation script according to the current time and point obtained by the state acquisition or generation process; and a unit for outputting the navigation information to be output while the instructions are processed, and for presenting the navigation information to a user or users. [0094]
  • The navigation script is described, for example, in a markup language which identifies with tags the time information, the point information, the information for guidance, and the other constituent elements of instructions. [0095]
  • Additionally, the navigation script can describe a directive for directing a plurality of instructions to be processed sequentially or in parallel. The unit for processing instructions processes the plurality of instructions sequentially or in parallel according to the above described directive for sequential/parallel instruction processing. [0096]
  • Furthermore, the unit for inputting a navigation script inputs a navigation script specified by a user or users by communicating with an external device which provides the navigation script via a network, and/or by reading the navigation script from a computer-readable electronic medium, and/or from an input device operated by a user or users. [0097]
  • The navigation information presenting apparatus according to the present invention further comprises a unit for parsing an input navigation script and for converting the script into hierarchical and grouped navigation data. The unit for processing instructions processes the instructions represented as structured navigation data. [0098]
  • The unit for outputting navigation information presents to a user or users a part or the whole of a navigation script, such as a current point, a departure point, en-route spots, a destination, and a route one after another or according to each instruction. Additionally, the navigation information is presented to the user or users as texts, maps, voice, images, videos, lights, smell, force, movements, etc. for a specified time, point, distance, input operation, and/or an external event. [0099]
  • As an operating mode of navigation, either a navigation mode or a simulation mode can be selected. Instructions are processed according to the state of an actual current time and/or point in the navigation mode, while the instructions are processed according to the state of a virtual current time and/or point in the simulation mode. As a result, navigation information is presented to a user or users. [0100]
  • A program for realizing the above described units by means of a computer can be stored in a computer-readable portable medium memory or a suitable storage medium such as a semiconductor memory, a hard disk, etc. Also a navigation script can be stored in a computer-readable portable medium memory such as a magnetic disk, an optical disk, an IC card, etc., or a suitable storage medium such as a semiconductor memory, a hard disk, etc. Or, the navigation script may be converted into a bar code, which can be registered to a printed matter. [0101]
  • The navigation script can be created and edited by a normal text editor or a GUI (Graphical User Interface) editor. Or, the script can be semi-automatically generated based on a history of time and point information, which is obtained while moving on a route to be actually navigated. [0102]
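  • By way of illustration only, the following Python sketch shows one way such semi-automatic generation might begin: a recorded history of (time, longitude, latitude) samples is thinned to the samples where the direction of travel changes, and each retained sample becomes the skeleton of a point instruction. The record format, the turning-point heuristic, and the function name are assumptions made only for this sketch; a naviscript built this way would normally still be annotated with names and information for guidance by hand.
    # Hypothetical sketch: pick instruction-worthy samples out of a movement history.
    def extract_instructions(history):
        """history: list of (hh:mm:ss, longitude, latitude) samples."""
        kept = [history[0]]
        for prev, cur, nxt in zip(history, history[1:], history[2:]):
            d1 = (cur[1] - prev[1], cur[2] - prev[2])
            d2 = (nxt[1] - cur[1], nxt[2] - cur[2])
            # Keep a sample when the heading changes (non-zero cross product).
            if abs(d1[0] * d2[1] - d1[1] * d2[0]) > 1e-9:
                kept.append(cur)
        kept.append(history[-1])
        return [{"time": t, "longitude": lon, "latitude": lat} for t, lon, lat in kept]

    samples = [("10:00:00", 133.330, 36.020), ("10:01:00", 133.335, 36.020),
               ("10:02:00", 133.340, 36.020), ("10:03:00", 133.340, 36.025)]
    for inst in extract_instructions(samples):
        print(inst)    # the straight-line sample is dropped; the turn is kept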
  • The above described navigation script has the feature that an instruction sequence relating to time, point, and navigation information is described in a markup language based on a predetermined specification, and is easy for human beings to read and write. Additionally, the navigation script can be created, provided, and used in a format shared by various devices, and is easy to distribute via a network or an electronic medium, etc. It can also be duplicated with ease. The navigation script can describe various navigation information related to time and point. Navigation information related to a point and a route, such as “This facility is famous for ◯◯”, can be described. Navigation information about time, such as “Informed ten minutes prior to arrival”, can also be described. [0103]
  • FIG. 1 is a block diagram exemplifying the configuration of a navigation information presenting apparatus according to the present invention. According to the present invention, an instruction sequence composed of data (such as text data, image data, voice data, etc.) of time, point, and information for guidance, which are stored in various formats, is described in a markup language description format. [0104]
  • An instruction is a unit of a script composed of navigation information including times (such as a departure time, en-route times, an arrival time, a start time, an end time, etc.), and points (such as a departure point, en-route spots, a destination, an intersection, a transfer point, a facility location, etc.), and one shot or a portion of various media data (a map, text, voice, music, an image, a video, etc.). The instruction is, for example, a directive for outputting voice data (aaa.wav) and image data (xxx.jpg), which explain a point A, at the point A on a certain route. [0105]
  • Such an instruction sequence, which is described in a markup language such as an XML (Extensible Markup Language) (“Extensible Markup Language (XML) 1.0,” World Wide Web Consortium (W3C) Recommendation, REC-xml-19980210, Feb. 10, 1998. http://www.w3.org/TR/1998/REC-xml-19980210) format, is referred to as a navigation script, according to the present invention. The navigation script is abbreviated to a naviscript in the remaining portion of this specification. The naviscript is stored and managed by a [0106] center 40. Or, the naviscript is stored in various media such as a magnetic disk, a CD-ROM, etc., and is read from a user terminal 10.
  • An [0107] operation inputting unit 11 of the user terminal 10 selects a naviscript from among the naviscripts stored in the center 40 and/or various media 32 via a network accessing unit 12 and/or a medium accessing unit 13 in response to a retrieval request issued by a user, and passes the selected naviscript and/or a naviscript directly input by a user to a script converting unit 14. The script converting unit 14 parses the naviscript, and converts it into structured navigation data. When the user uses the naviscript during an actual move (navigation mode), an instruction processing unit 15 obtains the current state (current time, point, etc.) of the user from the state acquiring unit 16, and complements the route information of the structured navigation data. Navigation information is then output from a navigation outputting unit 18 based on the structured navigation data according to the obtained state.
  • When the user uses a naviscript in a virtual state (simulation mode), the [0108] instruction processing unit 15 obtains a virtual current time and point from a state generating unit 17, and complements the route information of the structured navigation data. Navigation information is then output from the navigation outputting unit 18.
  • Assume that there is a naviscript for navigating, for example, a ◯◯ Tour on a route from Tokyo Station to the Rainbow Bridge via Kyobashi IC (InterChange), and the naviscript describes the following instructions. [0109]
  • (1) An instruction for outputting voice data “Tokyo Station” at Tokyo Station. [0110]
  • (2) An instruction for outputting voice data “Welcome to ◯◯ Tour” 2 minutes after the first data, and displaying image data summarizing the tour. [0111]
  • (3) An instruction for outputting voice data “Kyobashi IC” at Kyobashi IC. [0112]
  • (4) An instruction for outputting voice data “Rainbow Bridge Soon Ahead” 3 km before the Rainbow Bridge. [0113]
  • (5) An instruction for outputting voice data “Rainbow Bridge” at the Rainbow Bridge. [0114]
  • In this case, the [0115] operation inputting unit 11 reads this naviscript from the center 40 via a network 31, etc. and starts to execute the naviscript, according to a user instruction. The script converting unit 14 then generates structured navigation data by converting the naviscript. The instruction processing unit 15 first extracts the descriptions of points and the route, which are included in the instructions, based on the structured navigation data, and displays the summary of the route by referring a database 20 storing map information, etc. Then, the instruction processing unit 15 obtains the current position or time from the state acquiring unit 16 of a GPS, etc., and processes the instructions based on the obtained point or time. As a result, the navigation outputting unit 18 outputs the voice data “Tokyo Station” when the user reaches Tokyo Station, outputs the voice message “Welcome to ◯◯ Tour” 2 minutes later, and displays the image data of the summary of the tour. Furthermore, the navigation outputting unit 18 outputs the voice data “Kyobashi IC” at Kyobashi IC, the voice data “Rainbow Bridge Soon Ahead” 3 km before the Rainbow Bridge, and the voice data “Rainbow Bridge” upon arrival at the Rainbow Bridge. Accordingly, the user can obtain the helpful navigation information at the suitable spots at the suitable timing while moving on the route of the ◯◯ Tour.
  • For such a naviscript, an instruction sequence relating to time, places, and information for guidance is described in a markup language description format. A generated naviscript is easy to read and write, similar to an existing markup language, thereby facilitating retrieval and processing. Accordingly, the meaning of the data of a naviscript, and whether or not its instruction sequence is described in the order to be executed, are made clear to a naviscript generator. [0116]
  • Additionally, it becomes possible to rearrange instructions sequentially or in parallel, to optimize instructions, and to structure data (into hierarchies or groups), etc., so that various navigation information about time and points can be presented. As a result, navigation information can be easily created, modified, etc. [0117]
  • Furthermore, since the naviscript obtained from the [0118] center 40, etc. is converted into structured navigation data corresponding to a local terminal itself, one naviscript can be shared by various devices and systems.
  • Also, it becomes clear to a user that navigation information is presented according to an instruction sequence (a time sequence and/or a point sequence), thereby obtaining navigation information, which is more suitable for a state, at suitable timing. In the navigation mode, navigation information can be obtained on an actual route. Besides, in the simulation mode, the navigation of a certain route can be virtually experienced. [0119]
  • The naviscript can be easily created and edited by using an existing text editor, and a created naviscript can be registered to a center, etc., so that everybody can obtain navigation information everywhere by using the naviscript via a network, etc. [0120]
  • Naviscript Editing
  • FIG. 2 explains the process performed by a [0121] script editing unit 41. Since the naviscript is described in a markup language, it can be edited by using a normal text editor. As shown on a naviscript editing screen 42 illustrated in FIG. 2, a naviscript can also be generated/edited with a GUI by editing and inputting a route, etc. on a map with the use of the map information obtained from a map information database 44, and by converting the information on the naviscript editing screen 42 into a naviscript described in a markup language with the use of a translator 43 which converts graphic information such as a map, etc. into text information. The translator 43 has not only a capability for converting a map image into a naviscript, but also a capability for converting a naviscript stored in a buffer/file 45 into information to be displayed on a map. Such a naviscript editing tool can be easily implemented in the same manner as a home page creation tool for the Internet. The editing tool can be utilized not only by the center 40, but also by a personal computer that a general user possesses.
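  • As a rough, hypothetical illustration of the two directions of the translator 43, the Python sketch below serializes a route edited on the map (represented here simply as an ordered list of named points with coordinates) into naviscript text, and reads such a naviscript back into a point list for display on the map. The point representation, the function names, and the use of a regular expression are assumptions made only for this sketch.
    import re

    def route_to_naviscript(points, title):
        # points: ordered list of (name, longitude, latitude) edited on the map.
        lines = ["<naviscript version=0.3>", f" <title> {title} </title>", " <navi>", "  <seq>"]
        lines += [f'   <inst ref = "inst-point-{name}"> </inst>' for name, _, _ in points]
        lines += ["  </seq>", " </navi>"]
        for name, lon, lat in points:
            lines += [f' <inst id = "inst-point-{name}">', "  <point>",
                      f"   <name> {name} </name>", f"   <longitude> {lon} </longitude>",
                      f"   <latitude> {lat} </latitude>", "  </point>", " </inst>"]
        return "\n".join(lines + ["</naviscript>"])

    def naviscript_to_route(text):
        # Inverse direction: recover (name, longitude, latitude) for the map display.
        pattern = (r"<point>\s*<name>\s*(.+?)\s*</name>\s*"
                   r"<longitude>\s*(.+?)\s*</longitude>\s*"
                   r"<latitude>\s*(.+?)\s*</latitude>")
        return re.findall(pattern, text, re.DOTALL)

    script = route_to_naviscript([("Tokyo Sta.", "133.33.36", "36.2.5"),
                                  ("Daiba IC", "133.37.46", "36.3.5")], "Drawn Route")
    print(naviscript_to_route(script))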
  • Naviscript Summary
  • The naviscript language adopted in this preferred embodiment is a markup language for describing a naviscript, which is newly defined as a subset of an XML (Extensible Markup Language) (“Extensible Markup Language (XML) 1.0,” World Wide Web Consortium (W3C) Recommendation, REC-xml-19980210, Feb. 10, 1998. http://www.w3.org/TR/1998/REC-xml-19980210) laid down by W3C (World Wide Web Consortium). [0122]
  • In a naviscript, a string enclosed by angle brackets “<” and “>”, such as <inst id=inst-01>, </inst>, <title>, or </title>, is called a tag. A tag which does not start with “</” is called a start tag, while a tag which starts with “</” is called an end tag. The start and the end tags are used as a pair, such as <inst id=inst-01> and </inst>, or <title> and </title>. Such a pair is hereinafter referred to as a tag set. Additionally, for example, “id” included in <inst id=inst-01> is defined to be an attribute of a tag, and “inst-01” is defined to be the value of the attribute. [0123]
  • A naviscript is described by a hierarchical structure of tag sets each forming a pair. When no tag set exists in the portion enclosed by a tag set, that portion is defined to be the contents of the tag. In the naviscript language, a naviscript is thus composed of tags, attributes, and contents. [0124]
  • Here, suppose that the following naviscript exists. [0125]
  • <inst>[0126]
  • <time>◯</time>[0127]
  • <info>□</info>[0128]
  • </inst>[0129]
  • In this naviscript, “◯” enclosed by <time> and </time> between <inst> and </inst>, and “□” enclosed by <info> and </info> indicate that navigation information about □ is output at a time ◯. Here, “inst” indicates an instruction. Also suppose that the following naviscript exists. [0130]
  • <inst>[0131]
  • <point>◯</point>[0132]
  • <info>□</info>[0133]
  • </inst>[0134]
  • where “◯” enclosed by <point> and </point> between <inst> and </inst>, and “□” enclosed by <info> and </info> indicate that navigation information about □ is output at a place ◯. [0135]
  • Instructions enclosed by <seq> and </seq> perform navigation sequentially, while instructions enclosed by <par> and </par> perform navigation in parallel. Similarly, instructions enclosed by <time-optimal> and </time-optimal> perform navigation in an optimal order of required time. Instructions enclosed by <distance-optimal> and </distance-optimal> perform navigation in an optimal order of required distance. Instructions enclosed by <cost-optimal> and </cost-optimal> perform navigation in an optimal order of required cost. [0136]
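  • The following Python fragment is a minimal, hypothetical sketch of how a processor might apply these ordering directives to a flat group of instructions: <seq> keeps the written order, while the *-optimal directives reorder the group by a per-instruction estimate (a naive stand-in for a real route optimization); a <par> group would instead be dispatched concurrently, which is omitted here. The dictionary fields and the function name are assumptions.
    def order_instructions(kind, instructions):
        # kind: "seq", "time-optimal", "distance-optimal" or "cost-optimal";
        # each instruction carries estimated "duration", "distance" and "cost" fields.
        keys = {"time-optimal": "duration", "distance-optimal": "distance",
                "cost-optimal": "cost"}
        if kind in keys:
            return sorted(instructions, key=lambda inst: inst[keys[kind]])
        return list(instructions)          # "seq": keep the written order

    insts = [{"id": "inst-point-A", "duration": 41, "distance": 31.7, "cost": 540},
             {"id": "inst-point-B", "duration": 10, "distance": 12.0, "cost": 700}]
    print([i["id"] for i in order_instructions("seq", insts)])           # A, B
    print([i["id"] for i in order_instructions("cost-optimal", insts)])  # A, B (540 < 700)
    print([i["id"] for i in order_instructions("time-optimal", insts)])  # B, A (10 < 41)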
  • A specific example of a naviscript description is provided below. [0137]
  • <time>12:00</time>[0138]
  • This is an absolute time display “at 12:00”. [0139]
  • Additionally, [0140]
  • <time>+5 sec</time>[0141]
  • is a relative time display “5 seconds after a preceding instruction”. [0142]
  • <time>−10 min</time>[0143]
  • This is a relative time display “10 minutes before a succeeding instruction”. [0144]
  • Furthermore, [0145]
  • <longitude>◯◯</longitude>[0146]
  • <latitude>◯◯</latitude>[0147]
  • are a direct and absolute point display with a coordinate of longitude and latitude. [0148]
  • <name>◯◯</name>[0149]
  • <address>◯◯</address>[0150]
  • <phone>◯◯</phone>[0151]
  • are an indirect and absolute point display with a name, an address, and a telephone number. [0152]
  • <location>+1.0 km</location>[0153]
  • is a relative point display “1 km beyond a preceding point”. [0154]
  • <location>−1.0 km</location>[0155]
  • is a relative point display “1 km before a succeeding point”. [0156]
  • <name>◯◯National Park</name>[0157]
  • <address>◯◯Ward ◯◯Town</address>[0158]
  • <zip>123-4566</zip>[0159]
  • are an indirect point range display with a name, an address, and a zip code. [0160]
  • <route>[0161]
  • <name>[0162] Route 1</name>
  • </route>[0163]
  • is route specification with a name. Meanwhile, [0164]
  • <route src=route-data.dat>[0165]
  • </route>[0166]
  • is route information specification with a data file and [0167]
  • <route func=route-func.fnc>[0168]
  • </route>[0169]
  • is route information specification with a function. [0170]
  • Furthermore, a condition of whether or not to execute each instruction can be described depending on whether or not the information about a navigation user, a move method, an environment, etc. are equal to certain values, or belong to a certain range (set). [0171]
  • For example, the information about a navigation user, which are used for the condition of whether or not to execute an instruction, include the following items: sex, age, date of birth, blood type, single/married/divorced/bereaved, the number of children, family members, address, legal domicile, place of employment, occupation (business category, occupation type, title), height, weight, figure, physical ability, disease, handicap (sense of sight, sense of color, sense of hearing, taste, language, body), character, hobby, liking (liquor, tobacco, having a sweet tooth/drinking, Japanese-style food/Western food, fish/meat, etc.), driving years, accident history, violation history, temperature, blood pressure, pulsation, heart beat rate, brain waves, eyeball movement, driving time, driver, fellow passenger, etc. [0172]
  • The information about a move method include: a type (walking, bicycle, two-wheeled vehicle, car, bus, train, ship, airplane, etc.), position, speed, acceleration, direction, angular velocity, angular acceleration, altitude, residual quantity of gas, light ON/OFF, windshield wiper ON/OFF, room lamp ON/OFF, air conditioner ON/OFF, radio/TV ON/OFF, car navigation system ON/OFF, air flow, sound volume, checking/mobile inspection time, car type, displacement, automobile maker, right/left handle, etc. [0173]
  • The information about an environment include weather (fine/cloudy/rain/snow, rainy season/typhoon), temperature, humidity, atmospheric pressure, probability of rainfall, UV index, photochemical smog index, noise index, traffic jam state, regulation information, accident information, etc. [0174]
  • Additionally, text data, voice data, sound data, image data, video data, etc. can be specified as navigation information as follows. [0175]
  • <text>◯◯</text>[0176]
  • <voice>◯◯</voice>[0177]
  • <sound>◯◯</sound>[0178]
  • <image>◯◯</image>[0179]
  • <video>◯◯</video>[0180]
  • Furthermore, a URL (Uniform Resource Locator) can be described as follows. [0181]
  • <inst id=“inst-point-breakwater”>[0182]
  • <point ref=“http://www.naviscript.com/japan/tokyo/odaiba.nav#point-breakwater”></point>[0183]
  • </inst>[0184]
  • Specific Example of a Naviscript
  • A specific example of a naviscript described in the naviscript language is provided next. Since the details of the specifications of the tags used in the naviscript language are described later in this specification, please refer to them as needed. [0185]
  • <naviscript version=0.3> [0186]
     <title> Rainbow Town Tour </title>
     <version> example-04_05 </version>
     <copyright> All Rights Reserved,
        Copyright (C) FujiLab Ltd. 1998
    </copyright>
    <navi>
     <title> Rainbow Town </title>
      <author> Fuji Kanko </author>
     <date> 98/09/10 </date>
     <duration> 3hour40min </duration>
     <distance> 95.0km </distance>
     <cost> 1940yen </cost>
     <par>
      <seq>
       <inst ref = “inst-info-opening”> </inst>
       <inst ref = “inst-point-Kaihinmakuhari
    Sta.”>
    </inst>
       <inst ref = “inst-point-Tokyo Sta.”>
    </inst>
       <inst ref = “inst-point-Tokyo Sta. Yaesu
    Central Exit”> </inst>
       <inst ref = “inst-point-Kyobashi IC”>
    </inst>
       <inst ref = “inst-info-Rainbow Bridge-
    navigation1”> </inst>
       <inst ref = “inst-info-Rainbow Bridge
    navigation2”> </inst>
       <inst ref = “inst-point-Edobashi JC”>
    </inst>
       <inst ref = “inst-point-Daiba IC”> </inst>
       <inst ref = “inst-object-restaurant”>
    </inst>
       <inst ref = “inst-object-cafe”>
    </inst>
       <inst ref = “inst-point-breakwater”>
    </inst>
       <inst ref = “inst-object-FujiSun TV”>
    </inst>
       <inst ref = “inst-point-Tokyo Sta. Yaesu
    Central Exit”> </inst>
       <inst ref = “inst-info-closing”> </inst>
      </seq>
      <seq>
       <inst ref = “inst-info-noon”> </inst>
      </seq>
      </par>
    </navi>
    <inst id = “inst-info-opening”>
      <time> +5sec </time>
      <info>
       <voice> Welcome to Rainbow Town Tour!
    </voice>
      </info>
    </inst>
    <inst id = “inst-point-Kaihinmakuhari Sta.”>
      <point>
       <category> station </category>
       <name> Kaihinmakuhari </name>
      </point>
      <route>
       <means> train </means>
       <category> JR </category>
       <category> Keiyo Line </category>
        <duration> 41min </duration>
       <distance> 31.7km </distance>
       <cost> 540yen </cost>
      </route>
    </inst>
    <inst id = “inst-point-Tokyo Sta.”>
     <point>
      <category> station </category>
       <name> Tokyo </name>
     </point>
    </inst>
    <inst id = “inst-point-Tokyo Sta. Yaesu Central Exit”>
     <point>
      <name> Tokyo Sta. Yaesu Central Exit </name>
     <longitude> 133.33.36 </longitude>
      <latitude> 36.2.5 </latitude>
     </point>
     <route>
      <means> car </means>
      <category> Metropolitan Highway </category>
      <cost> 700yen </cost>
     </route>
    </inst>
    <inst id = “inst-point-Kyobashi IC”>
     <point>
       <name> Kyobashi IC </name>
      <longitude> 133.33.36 </longitude>
      <latitude> 36.2.5 </latitude>
     </point>
     <route>
      thesame
     </route>
    </inst>
    <inst id = “inst-info-Rainbow Bridge-navigation1”>
     <time> −10min </time>
     <info>
      <text> Rainbow Bridge in 10 minutes </text>
     </info>
    </inst>
    <inst id = “inst-info-Rainbow Bridge-navigation2”>
     <location> −1.0 km </location>
     <info ref = “object-Rainbow Bridge#info”>
     </info>
    </inst>
    <inst id = “inst-point-Edobashi JC”>
     <point>
       <name> Edobashi JC </name>
     </point>
     <route>
       <name> Rainbow Bridge </name>
      <means> car </means>
      <category> Metropolitan Highway
    </category>
     </route>
    </inst>
    <inst id = “inst-point-Daiba IC”>
     <point ref = “point-Daiba IC”> </point>
     <route> thesame </route>
    </inst>
    <inst id = “inst-object-restaurant”
     if = “(ref(inst-point-Daiba IC#time) &ge 11:30)&&
      (ref(inst-point-Daiba IC#time) &le 13:30)”>
     <point ref = “object-restaurant”> </point>
     <info>
      <text ref = “object-restaurant#text”>
    </text>
      <image ref = “object-restaurant#image”>
    </image>
     </info>
    </inst>
    <inst id = “inst-object-cafe”
     if = “(ref(inst-point-Daiba IC#time) &lt 11:30)||
      (ref(inst-point-Daiba IC#time) &gt 13:30)”>
     <object ref = “object-cafe”> </object>
    </inst>
    <inst id = “inst-point-breakwater”>
     <point ref = “http://www.naviscript.com/japan
      /tokyo/odaiba.nav#point-breakwater”>
    </point>
    </inst>
    <inst id = “inst-object-FujiSun TV”>
     <object>
       <name> FujiSun TV </name>
      <address> 9-9-9, Daiba, Minato Ward, Tokyo
    </address>
     </object>
    </inst>
    <inst id = “inst-point-Tokyo Sta. Yaesu Central Exit”>
     <point>
      <name> Tokyo Sta. Yaesu Central Exit </name>
      <longitude> 133.33.36 </longitude>
      <latitude> 36.2.5 </latitude>
     </point>
    </inst>
    <inst id = “inst-info-closing”>
     <time> +0sec </time>
     <info>
      <par>
       <voice times = 1> Hope you enjoyed this
    tour! </voice>
        <sound src = “sound-bye.wav” duration
    = “2min30sec”> </sound>
      </par>
     </info>
    </inst>
    <inst id = “inst-info-noon”>
     <time> 12:00 </time>
     <info>
      <voice> Now at noon </voice>
     </info>
    </inst>
    <point id = “point-Daiba IC”>
      <name> Daiba IC </name>
     <longitude> 133.37.46 </longitude>
     <latitude> 36.3.5 </latitude>
    </point>
    <object id = “object-Rainbow Bridge”>
      <name> Rainbow Bridge </name>
     <category> bridge </category>
    <info>
      <text> Rainbow Bridge is 125m above sea
    level and 826m in length... </text>
     </info>
    </object>
    <object id = “object-restaurant”>
     <name> Restaurant Fuji </name>
     <category> restaurant </category>
     <category> Italian </category>
     <phone> 987-654-3210 </phone>
     <text> Specialty is...made by Italian chef
    </text>
     <image src = “image-restaurant.jpg”> </image>
    </object>
    <object id = “object-cafe”>
      <name> Cafe Fuji </name>
      <category> cafe </category>
      <phone> 999-999-9999 </phone>
    </object>
    </naviscript>
  • In the first half of the above provided naviscript, from <navi> to </navi>, the information such as the title, the version, etc. of the naviscript is defined, and the sequence of instructions to be executed by this naviscript is specified. [0187]
  • Next, the contents of the respective instructions are defined after </navi>. For example, the initial instruction “inst-info-opening” instructs a navigation message “Welcome to Rainbow Town Tour” to be vocally output after 5 seconds elapse from a departure time. [0188]
  • Example of Conversion from a Naviscript into Structured Navigation Data
  • A naviscript like the above described one is converted into structured navigation data by the [0189] script converting unit 14. An example of structured navigation data into which the above described naviscript is converted is provided below.
  • EXAMPLE-04-05.dat
  • [0190]
     naviscript.title
      = “Rainbow Town Tour”;
    naviscript.version
      = “example-04_05”;
     naviscript.copyright
      = “All Rights Reserved, Copyright (C)
    FujiLab Ltd. 1998.”;
     naviscript.navi.title
      = “Rainbow Town”;
     naviscript.navi.author
      = “Fuji Kanko”;
     naviscript.navi.date
      = “98/09/10”;
     naviscript.navi.duration
      = “3hour40min”;
     naviscript.navi.distance
      = “95.0km”;
     naviscript.navi.cost
      = “1940yen”;
     naviscript.navi.instlist
      = “par(seq(1,2,3,4,5,6,7,8,9,10,11,12,
    13,14,15),seq(16))”;
     naviscript.navi.inst[1].id
      = “inst-info-opening”;
     naviscript.navi.inst[1].time
      = “+5sec”;
     naviscript.navi.inst[1].info.voice
      = “Welcome to Rainbow Town Tour!”;
     naviscript.navi.inst[2].id
      = “inst-point-Kaihinmakuhari Sta.”;
     naviscript.navi.inst[2].point.category
      = “station”;
     naviscript.navi.inst[2].point.name
      = “Kaihinmakuhari”;
     naviscript.navi.inst[2].route.means
      = “train”;
     naviscript.navi.inst[2].route.category
      = “JR”;
     naviscript.navi.inst[2].route.category
      = “Keiyo Line”;
     naviscript.navi.inst[2].route.duration
      = “41min”;
     naviscript.navi.inst[2].route.distance
      = “31.7km”;
     naviscript.navi.inst[2].route.cost
      = “540yen”;
     naviscript.navi.inst[3].id
      = “inst-point-Tokyo Sta.”;
     naviscript.navi.inst[3].point.category
      = “station”;
     naviscript.navi.inst[3].point.name
      = “Tokyo”;
     naviscript.navi.inst[4].id
      = “inst-point-Tokyo Sta. Yaesu Central
    Exit”;
     naviscript.navi.inst[4].point.name
      = “Tokyo Sta. Yaesu Central Exit”;
     naviscript.navi.inst[4].point.longitude
      = “133.33.36”;
     naviscript.navi.inst[4].point.latitude
      = “36.2.5”;
     naviscript.navi.inst[4].route.means
      = “car”;
     naviscript.navi.inst[4].route.category
      = “Metropolitan Highway”;
     naviscript.navi.inst[4].route.cost
      = “700yen”;
     naviscript.navi.inst[5].id
      = “inst-point-Kyobashi IC”;
     naviscript.navi.inst[5].point.name
      = “Kyobashi IC”;
     naviscript.navi.inst[5].point.longitude
      = “133.33.36”;
     naviscript.navi.inst[5].point.latitude
      = “36.2.5”;
     naviscript.navi.inst[5].route.thesame
      = “yes”;
     naviscript.navi.inst[6].id
      = “inst-info-Rainbow Bridge-navigation1”;
     naviscript.navi.inst[6].time
      = “−10min”;
     naviscript.navi.inst[6].info.text
      = “Rainbow Bridge in 10 minutes”;
     naviscript.navi.inst[7].id
      = “inst-info-Rainbow Bridge-navigation2”;
     naviscript.navi.inst[7].location
      = “−1.0km”;
     naviscript.navi.inst[7].info.text
      = “Rainbow Bridge is 125m above sea level
    and 826m in length,...”;
     naviscript.navi.inst[8].id
      = “inst-point-Edobashi JC”;
     naviscript.navi.inst[8].point.name
      = “Edobashi JC”;
     naviscript.navi.inst[8].route.name
      = “Rainbow Bridge”;
     naviscript.navi.inst[8].route.means
      = “car”;
     naviscript.navi.inst[8].route.category
      = “Metropolitan Highway”;
     naviscript.navi.inst[9].id
      = “inst-point-Daiba IC”;
     naviscript.navi.inst[9].name
      = “Daiba IC”;
     naviscript.navi.inst[9].longitude
      = “133.37.46”;
     naviscript.navi.inst[9].latitude
      = “36.3.5”;
     naviscript.navi.inst[9].route.thesame
      = “yes”;
     naviscript.navi.inst[10].id
      = “inst-object-restaurant”;
     naviscript.navi.inst[10].if
      = “(ref(inst-point-Daiba IC#time) &ge
    11:30) && (ref(inst-point-Daiba IC#time) &le 13:30)”;
     naviscript.navi.inst[10].point.id
      = “object-restaurant”;
     naviscript.navi.inst[10].point.name
      = “Restaurant Fuji”;
     naviscript.navi.inst[10].point.category
      = “restaurant”;
     naviscript.navi.inst[10].point.category
      = “Italian”;
     naviscript.navi.inst[10].point.phone
      = “987-654-3210”;
     naviscript.navi.inst[10].point.info.text
      = “Specialty is...made by Italian chef”;
     naviscript.navi.inst[10].point.info.image.src
      = “image-restaurant.jpg”;
     naviscript.navi.inst[11].id
      = “inst-object-cafe”;
     naviscript.navi.inst[11].if
      = “(ref(inst-point-Daiba IC#time) &lt
     11:30)||(ref(inst-point-Daiba IC#time) &gt 13:30)”;
     naviscript.navi.inst[11].object.id
      = “object-cafe”;
     naviscript.navi.inst[11].object.name
      = “Cafe Fuji”;
     naviscript.navi.inst[11].object.category
      = “cafe”;
     naviscript.navi.inst[11].object.phone
      = “999-999-9999”;
     naviscript.navi.inst[12].id
      = “inst-point-breakwater”;
     naviscript.navi.inst[12].point.ref
      = “http://www.naviscript.com/japan/tokyo
    /odaiba.nav#point-breakwater”;
     naviscript.navi.inst[13].id
      = “inst-object-FujiSun TV”;
     naviscript.navi.inst[13].object.name
      = “FujiSun TV”;
     naviscript.navi.inst[13].object.address
      = “9-9-9, Daiba, Minato Ward, Tokyo”;
     naviscript.navi.inst[14].id
      = “inst-point-Tokyo Sta. Yaesu Central
    Exit”;
     naviscript.navi.inst[14].point.name
      = “Tokyo Sta. Yaesu Central Exit”;
     naviscript.navi.inst[14].point.longitude
      = “133.33.36”;
     naviscript.navi.inst[14].point.latitude
      = “36.2.5”;
     naviscript.navi.inst[15].id
      = “inst-info-closing”;
     naviscript.navi.inst[15].time
      = “+0sec”;
     naviscript.navi.inst[15].infolist
      = “par(1,2)”;
     naviscript.navi.inst[15].info[1].voice.times
      = “1”;
     naviscript.navi.inst[15].info[1].voice
      = “Hope you enjoyed this tour!”;
     naviscript.navi.inst[15].info[2].sound.src
      = “sound-bye.wav”;
     naviscript.navi.inst[15].info[2].sound.duration
      = “2min30sec”;
     naviscript.navi.inst[16].id
      = “inst-info-noon”;
     naviscript.navi.inst[16].time
      = “12:00”;
     naviscript.navi.inst[16].info.voice
      = “Now at noon”;
  • FIGS. 3 and 4 show a portion of the above described structured navigation data in the form of a table. As is easily understood from these figures, the voice navigation “Welcome to Rainbow Town Tour” is first performed. Then, the route navigation from Tokyo Sta. Yaesu Central Exit to Daiba IC via Kyobashi IC and Edobashi JC is performed after the train navigation from Kaihinmakuhari Sta. to Tokyo Sta. The text “Rainbow Bridge in 10 minutes” is displayed 10 minutes before a scheduled time at which Edobashi JC is passed through. Additionally, if the arrival time at Daiba IC is between 11:30 and 13:30, the information about restaurants is presented. If the arrival time at Daiba IC is before 11:30 or after 13:30, the information about cafes is presented. Provided next are the explanations about the processes performed by the respective units shown in FIG. 1. [0191]
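  • As a small, hypothetical illustration of how such an instruction condition might be evaluated, the Python sketch below reproduces the restaurant/cafe branch of the example: the restaurant instruction is selected when the estimated arrival time at Daiba IC falls between 11:30 and 13:30, and the cafe instruction otherwise. The helper names and the way the arrival time is supplied are assumptions.
    import datetime

    def hhmm(text):
        hours, minutes = map(int, text.split(":"))
        return datetime.time(hours, minutes)

    def select_instruction(arrival_at_daiba_ic):
        # Mirrors the "if" attributes of inst-object-restaurant / inst-object-cafe.
        if hhmm("11:30") <= arrival_at_daiba_ic <= hhmm("13:30"):
            return "inst-object-restaurant"
        return "inst-object-cafe"

    print(select_instruction(hhmm("12:10")))   # inst-object-restaurant
    print(select_instruction(hhmm("14:05")))   # inst-object-cafe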
  • Process Performed by the Operation Inputting Unit 11
  • The [0192] operation inputting unit 11 obtains a naviscript stored in the center 40 or the medium 32, or a naviscript input by a user. The flow of the process performed by the operation inputting unit 11 is shown in FIG. 5. The operation inputting unit 11 accesses the center 40 via the network 31 with the use of the network accessing unit 12, and/or accesses the medium 32 storing naviscripts with the use of the medium accessing unit 13. A desired naviscript is retrieved and selected according to a user instruction, or a naviscript is directly input by a user, so that the operation inputting unit 11 receives the naviscript (step S11). The operation inputting unit 11 passes the received naviscript to the script converting unit 14 (step S12). Although the naviscript itself is received from the medium 32 at this time, an external image file, etc. specified with URL within the naviscript may be sometimes received via the network 31.
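  • A minimal Python sketch of this retrieval step is shown below; it merely distinguishes a network source (the center 40 reached via the network 31) from a local medium, and returns the naviscript text to be handed to the script converting unit 14. The function names are assumptions, and the commented usage line reuses a URL that appears elsewhere in this description only as an example.
    import urllib.request
    from pathlib import Path

    def fetch_naviscript(source):
        # Network source: center 40 via network 31; otherwise medium 32 (disk, CD-ROM, ...).
        if source.startswith(("http://", "https://")):
            with urllib.request.urlopen(source) as response:
                return response.read().decode("utf-8")
        return Path(source).read_text(encoding="utf-8")

    # Hypothetical hand-over to the script converting unit:
    # text = fetch_naviscript("http://www.naviscript.com/japan/tokyo/odaiba.nav")
    # structured = convert_to_structured_data(text)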
  • Process Performed by the Script Converting Unit 14
  • The [0193] script converting unit 14 converts a naviscript described in a markup language into structured navigation data. The flow of the process performed by the script converting unit 14 is shown in FIG. 6. As shown in this figure, the script converting unit 14 receives a naviscript from the operation inputting unit 11 (step S21), converts the received naviscript into structured navigation data (step S22), and passes the data to the instruction processing unit 15 (step S23).
  • The [0194] script converting unit 14 can convert a naviscript into not only the structured data referenced by the instruction processing unit 15, but also various types of structured data used by a local system or other devices, etc. Accordingly, it is possible, for example, to make a scheduler display a time instruction, either passed to the scheduler unchanged or after being converted, or to display on a map the information obtained by converting a place instruction into a map description script.
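  • The Python sketch below is a rough illustration of this conversion for a strictly well-formed naviscript: the element tree is flattened into dotted-path key/value pairs resembling the structured navigation data listed above. It is only a simplification: the naviscript examples in this description use unquoted attribute values, and a real converter would number repeated elements (inst[1], inst[2], ...) and use a dedicated parser rather than xml.etree.
    import xml.etree.ElementTree as ET

    def to_structured(naviscript_text):
        pairs = []

        def walk(element, path):
            children = list(element)
            if not children and element.text and element.text.strip():
                pairs.append((path, element.text.strip()))   # leaf contents
            for child in children:
                walk(child, f"{path}.{child.tag}")
            for name, value in element.attrib.items():
                pairs.append((f"{path}.{name}", value))      # tag attributes

        root = ET.fromstring(naviscript_text)
        walk(root, root.tag)
        return pairs

    sample = """<naviscript version="0.3">
      <navi>
        <title>Rainbow Town</title>
        <inst id="inst-info-opening"><time>+5sec</time></inst>
      </navi>
    </naviscript>"""
    for key, value in to_structured(sample):
        print(f'{key} = "{value}";')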
  • Process Performed by the Instruction Processing Unit 15
  • The [0195] instruction processing unit 15 processes the instructions included in structured navigation data according to the current state of a user or a virtually set state for simulation after complementing the information in an unspecified portion of the route information of the structured navigation data received from the script converting unit 14. The instruction processing unit 15 performs the process shown in FIG. 7 as a preparation process of the instruction processing, and further performs the process shown in FIG. 8 as an execution process.
  • In the preparation process of the instruction processing, upon receipt of structured navigation data from the script converting unit [0196] 14 (step S31), the instruction processing unit 15 determines whether the execution mode set by a user is either a navigation mode or a simulation mode (step S32). If the instruction processing unit 15 determines that the execution mode is the navigation mode, it makes the state acquiring unit 16 acquire a state (an actual current time/point), and obtains the state (step S33). Then, the instruction processing unit 15 adds the actual current point to the beginning of the structured navigation data (step S34). The flow then goes to step S35.
  • If the [0197] instruction processing unit 15 determines that the execution mode is the simulation mode, it issues a request to prepare a state to the state generating unit 17 and a further request to generate a state upon completion of the initial request. The instruction processing unit 15 then obtains the state (virtual current time and point) (step S42), and adds the virtual current position to the beginning of the structured navigation data (step S43).
  • Next, the [0198] instruction processing unit 15 attaches a flag indicating “original” to all of the instructions (step S35). This flag is intended to make a distinction between an instruction originally included in a naviscript and an instruction newly added by a complementing process to be described later.
  • Then, the [0199] instruction processing unit 15 complements the information about a place within the structured data (step S36). In this complementing process, an attribute or attributes not described in the naviscript among various attributes such as latitude, longitude, altitude, a name, an address, a phone number, a zip code, etc. are retrieved from the database unit 20 with a described attribute as a key. If only an area is specified, the attribute of a representative spot in the area is retrieved. By way of example, spots representative of Shinjuku Ward, such as Shinjuku Ward Office, Shinjuku Station, etc. or spots representative of Mt. Fuji, such as Top of Mt. Fuji, an entry point of a Mt. Fuji route etc., are retrieved from the database unit 20. If a plurality of retrieval results are obtained, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using an evaluation index. The instruction processing unit 15 describes a retrieved/selected attribute in a corresponding portion of the structured navigation data.
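  • A toy Python illustration of this place-complementing step is given below: attributes that an instruction leaves unspecified (here longitude and latitude) are looked up in a point database keyed by a described attribute such as the name, and a chooser stands in for the menu or evaluation index used when several candidates exist. The database contents and all names are assumptions.
    # Hypothetical point database; in the apparatus this role is played by database unit 20.
    POINT_DATABASE = {
        "Daiba IC": {"longitude": "133.37.46", "latitude": "36.3.5"},
        "Kyobashi IC": {"longitude": "133.33.36", "latitude": "36.2.5"},
    }

    def complement_point(point, choose=lambda candidates: candidates[0]):
        # point: dict containing at least one described attribute such as "name".
        candidates = [attrs for name, attrs in POINT_DATABASE.items()
                      if name == point.get("name")]
        if not candidates:
            return point                    # nothing found; leave the instruction as is
        chosen = choose(candidates)         # menu or evaluation index in practice
        for key, value in chosen.items():
            point.setdefault(key, value)    # never overwrite attributes already described
        return point

    print(complement_point({"name": "Daiba IC"}))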
  • Similarly, the [0200] instruction processing unit 15 complements the information about a route within the structured navigation data (a portion where the route is not specified, etc.) (step S37). Here, if a route is not specified in an item of a route, or if a category (such as a normal road, a toll road, a highway, time precedence, distance precedence, straight drive precedence, wider road precedence, etc.) is specified, the instruction processing unit 15 retrieves a route. If a plurality of retrieval results are obtained, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. Then, the instruction processing unit 15 additionally describes the retrieved/selected route to the corresponding portion of the structured navigation data, and attaches a flag indicating “addition” to the instruction of the route. If the route is entirely specified, the instruction processing unit 15 determines whether this route is either available or unavailable. If the instruction processing unit 15 determines that the route is unavailable, it retrieves another route. If a plurality of retrieval results exist also in this case, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. The instruction processing unit 15 then modifies the corresponding portion of the structured navigation data by describing the retrieved/selected route therein, and attaches a flag indicating “addition” to the instruction of that route.
  • Next, the [0201] instruction processing unit 15 converts all of the relatively specified places into absolutely specified places (step S38), estimates expected arrival times at the respective places (step S39), and rearranges all of the instructions in time order (step S40). The instruction processing unit 15 then sets an instruction cursor, which points to the instruction to be executed next, on the initial instruction (step S41). The flow then proceeds to the execution process shown in FIG. 8.
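  • The Python sketch below compresses steps S35 and S38-S41 into a few lines: the instructions are flagged as “original”, relative time offsets are resolved against the preceding instruction as a simple stand-in for the relative-to-absolute conversion and arrival-time estimation, the instructions are sorted in time order, and the instruction cursor is set to the first one. The database look-ups of steps S36-S37 are omitted, and all field names are assumptions.
    import datetime

    def prepare(instructions, departure):
        # instructions: dicts with either an absolute "time" (datetime) or a
        # relative "offset" (timedelta measured from the preceding instruction).
        for inst in instructions:
            inst["flag"] = "original"                      # step S35
        previous = departure
        for inst in instructions:                          # steps S38-S39 (simplified)
            if "time" not in inst:
                inst["time"] = previous + inst.get("offset", datetime.timedelta())
            previous = inst["time"]
        instructions.sort(key=lambda inst: inst["time"])   # step S40
        return instructions, 0                             # step S41: cursor on first instruction

    start = datetime.datetime(1998, 9, 10, 9, 0)
    insts = [{"id": "inst-info-opening", "offset": datetime.timedelta(seconds=5)},
             {"id": "inst-info-noon", "time": datetime.datetime(1998, 9, 10, 12, 0)}]
    prepared, cursor = prepare(insts, start)
    print([(i["id"], i["time"].strftime("%H:%M:%S")) for i in prepared])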
  • In the execution process, as shown in FIG. 8, the [0202] instruction processing unit 15 first determines whether the execution mode is either navigation mode or simulation mode (step S51). If the instruction processing unit 15 determines that the execution mode is the navigation mode, it makes the state acquiring unit 16 acquire a state (an actual current time and point), and obtains the state (step S52). If the instruction processing unit 15 determines that the execution mode is the simulation mode, it requests the state generating unit 17 to generate a state (a virtual current time and point), and obtains the state (step S59).
  • Then, the [0203] instruction processing unit 15 determines whether or not the current position is on the specified route in the instruction indicated by the instruction cursor. If the processing unit 15 determines that the current position is not on the route, it complements the information about the route of the structured navigation data (step S53). When the instruction processing unit 15 complements the information about the route, it retrieves a route from the current position to a near point on the specified route. If a plurality of retrieval results exist, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. The instruction processing unit 15 then additionally describes the retrieved or selected route to the corresponding portion of the structured navigation data, and attaches a flag indicating “addition” to the instruction of that route.
  • If the [0204] instruction processing unit 15 determines that the execution mode is the navigation mode (step S54), it obtains the information corresponding to the information separately set by the user (step S55). Assuming that the user makes a presetting such that traffic information is used, the instruction processing unit 15 makes the state acquiring unit 16 acquire the traffic information (such as the information about a traffic jam, a traffic regulation, an accident, etc.), and obtains the traffic information (step S55).
  • The [0205] instruction processing unit 15 then complements the information about a route based on the information obtained in step S55, etc. for the route to which the flag indicating “addition” is attached (step S56). In this complementing process, if the flag of the route specified by the instruction indicated by the instruction cursor is “addition”, if a traffic jam, regulation, or accident occurs at the nearest indispensable point or on the nearest indispensable route, which is included in the instructions subsequent to that indicated by the instruction cursor, and if an automatic route change setting separately made by the user is ON, the instruction processing unit 15 retrieves a route from the current position to the indispensable point or route. If a plurality of retrieval results exist, the instruction processing unit 15 inquires of the user which result to select by using a menu, or selects one of them by using a suitable evaluation index. The instruction processing unit 15 then modifies the corresponding portion of the structured navigation data by describing the retrieved or selected route therein, and attaches a flag indicating “addition” to the instruction of the route.
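  • The decision made in step S56 can be summarized by the following sketch; the parameter names and the dictionaries representing instructions and traffic events are assumptions made only for illustration, not the actual data structures of the specification.
    def should_reroute(current_inst, later_insts, traffic_events, auto_change_on):
        # Step S56 sketch: all three conditions must hold before a new route is retrieved.
        if not auto_change_on:                             # the user's automatic route change setting
            return None
        if current_inst.get("route_flag") != "addition":   # "original" routes are never replaced here
            return None
        for inst in later_insts:                           # instructions after the cursor, in order
            if inst.get("flag") != "original":
                continue                                   # only indispensable (original) targets count
            affected = any(ev.get("location") in (inst.get("point"), inst.get("route"))
                           for ev in traffic_events)       # jam, regulation, or accident reported there
            return inst if affected else None              # only the nearest indispensable target matters
        return None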
  • If the time and/or point described in the instruction indicated by the instruction cursor matches an actual current time and point (in the navigation mode) or a virtual current time and point (in the simulation mode) (or the time and/or point is within an area including an error caused by sampling, etc.), the [0206] instruction processing unit 15 passes the navigation information to the navigation outputting unit 18 (step S57), and updates the point of the instruction cursor to that of the next instruction (step S58). The instruction processing unit 15 repeats the above described process until there is no instruction left (step S59).
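  • Taken together, the execution process of FIG. 8 amounts to the loop sketched below; get_state(), matches(), and output_navigation() stand in for the state acquiring/generating units, the sampling-error tolerance check, and the navigation outputting unit 18, respectively, and are assumptions of this sketch rather than the actual interfaces.
    def execute(instructions, get_state, matches, output_navigation):
        # FIG. 8 sketch: get_state() returns the actual state in the navigation mode or the
        # virtual state in the simulation mode; matches() allows a sampling/positioning error.
        cursor = 0
        while cursor < len(instructions):                  # repeat until no instruction is left
            now, here = get_state()
            inst = instructions[cursor]
            if matches(inst.get("time"), now) and matches(inst.get("point"), here):
                output_navigation(inst)                    # pass the navigation info to the outputting unit
                cursor += 1                                # advance the instruction cursor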
  • Process Performed by the State Acquiring Unit 16
  • The [0207] state acquiring unit 16 acquires a state such as a current time and point, or various information such as traffic information, etc. The state acquisition process and the information acquisition process, which are performed by the state acquiring unit 16, are respectively shown in FIGS. 9 and 10. As shown in FIG. 9, upon receipt of the request to acquire a state, which is issued from the instruction processing unit 15, the state acquiring unit 16 acquires an actual current time and point, and passes the acquired time and point to the instruction processing unit 15 (step S61). Additionally, as shown in FIG. 10, upon receipt of the request to acquire information from the instruction processing unit 15, the state acquiring unit 16 acquires the information such as traffic information, etc. depending on need by using a suitable communication means, and passes the acquired information to the instruction processing unit 15 (step S62).
  • Process Performed by the State Generating Unit 17
  • The [0208] state generating unit 17 prepares and generates the values required for the simulation mode, such as a virtual current time and point, etc. The state preparation process and the state generation process, which are performed by the state generating unit 17, are respectively shown in FIGS. 11 and 12.
  • In the state preparation process performed by the [0209] state generating unit 17, a virtual departure time is set to either an actual current time or a time that the user separately sets, whichever is selected by the user or the system (step S71), upon receipt of the request to prepare a state from the instruction processing unit 15, as shown in FIG. 11. Next, the state generating unit 17 performs the process for setting a virtual departure point to a point that the user or the system selects from among an actual current point, a point that the user separately sets (such as a user home), and the initial point that first appears in the structured navigation data (step S72).
  • Next, the [0210] state generating unit 17 sets a virtual time elapse speed to either a default virtual time elapse speed set by the system or a virtual time elapse speed that the user separately sets, whichever is selected by the user or the system (step S73). Additionally, a virtual place move speed for each place move means such as walking, a bicycle, a car, etc. is set to either a default virtual place move speed that the system sets or a virtual place move speed that the user separately sets, whichever is selected by the user or the system (step S74).
  • Then, a [0211] simulation sampling time period is set to either a default simulation sampling time period that the system sets or a simulation sampling time period that the user separately sets, whichever is selected by the user or the system (step S75). Next, the state generating unit 17 sets the virtual departure time to a virtual current time (step S76), and the virtual departure point to a virtual current point (step S77).
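  • Each of these settings follows the same pattern: a value the user separately sets takes precedence over a system default, and the choice itself may be left to the user or to the system. A minimal sketch with purely illustrative numbers (none of which appear in the specification) is given below.
    def resolve_setting(user_value, system_default):
        # A parameter takes the value that the user separately sets when one exists,
        # otherwise the system default.
        return user_value if user_value is not None else system_default

    # Purely illustrative default values.
    virtual_time_elapse_speed = resolve_setting(None, 60.0)      # virtual seconds per real second
    virtual_move_speed_km_h = {"walk": resolve_setting(None, 4.0),
                               "bicycle": resolve_setting(None, 15.0),
                               "car": resolve_setting(None, 40.0)}
    sampling_period_sec = resolve_setting(None, 1.0)             # simulation sampling time period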
  • In the state generation process performed by the [0212] state generating unit 17, as shown in FIG. 12, upon receipt of the request to generate a state from the instruction processing unit 15, a virtual current time and point are passed to the instruction processing unit 15 (step S81), and the simulation sampling time period is added to the virtual current time, so that the virtual current time is updated (step S82). Furthermore, the virtual current point is updated (step S83). Namely, the virtual current point is updated to the point reached by advancing along the route specified by the instruction currently being executed, by the distance calculated by multiplying the virtual place move speed of the place moving means specified by that instruction by the simulation sampling time period. Note that, however, the virtual current point is updated to the end point of the route if the calculated point is beyond the range of the route.
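  • The update performed in steps S82 and S83 can be sketched as follows, assuming for simplicity that the virtual current point is represented as a distance travelled along the current route; this representation is an assumption of the sketch, not the latitude/longitude handling described in the text.
    def generate_state(state, route_length, move_speed, sampling_period):
        # FIG. 12 sketch: advance the virtual current time and point by one sampling step.
        time_now, pos_now = state["time"], state["position"]      # step S81: values handed back
        state["time"] = time_now + sampling_period                # step S82: advance the virtual time
        advanced = pos_now + move_speed * sampling_period         # step S83: advance along the route
        state["position"] = min(advanced, route_length)           # never beyond the end of the route
        return time_now, pos_now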
  • Process Performed by the Navigation Outputting Unit 18
  • The [0213] navigation outputting unit 18 outputs navigation (information) based on a naviscript. FIG. 13 shows the flow of the process performed by the navigation outputting unit 18. Upon receipt of the request to output navigation from the instruction processing unit 15, the navigation outputting unit 18 outputs the corresponding navigation (step S91).
  • By the way, the above described naviscript is described in XML. However, an instruction sequence may be described based on an arbitrary specification which is describable as a pair of a name that can identify the type of each piece of information and its contents, instead of XML. [0214]
  • For example, the above described format of structured data corresponds to one of such specifications. Accordingly, if the instruction sequence is given in the format of structured data initially, it may be input to the [0215] instruction processing unit 15 unchanged.
  • Semi-automatic Generation of a Naviscript
  • As referred to in the explanation of FIG. 2, a naviscript can be edited by a normal text editor or by a naviscript editing tool using a [0216] translator 43 equipped with a capability for converting a naviscript into the information of a point or a route on a map. Alternatively, a naviscript can be semi-automatically generated based on a route on which a user actually walks or drives a car.
  • A script semi-automatically generating [0217] unit 19 is intended to semi-automatically generate a naviscript by obtaining times at respective points on a route, position information such as latitude, longitude, etc., and time series data of information for guidance, etc. Since the naviscript is described in a markup language such as the XML, etc. as described above, it can be created by a general-purpose text editor, word processor, etc. By enabling a naviscript to be semi-automatically generated based on a route on which a user actually moves, even a person unfamiliar with a markup language etc. can create the naviscript with ease.
  • FIGS. 14A and 14B explain the process for semi-automatically generating a naviscript. Here, assume that a naviscript is to be generated for a route from a point “A” to a point “C” via a facility “B”, and that the user actually moves along this route carrying a [0218] user terminal 10. In this case, the script semi-automatically generating unit 19 obtains as time series data the time at a specified point and point information, and a varying time and the point information such as latitude, longitude, etc. from the state acquiring unit 16. Furthermore, the navigation information such as voice data, image data, etc. is inserted at suitable times and/or points according to user instructions.
  • When the time series data shown in FIG. 14B are obtained in this way, a tag is attached to each data item, which is then converted into text data. This text data is defined to be a naviscript. With this process, a naviscript like the following can be semi-automatically generated. [0219]
    <naviscript>
     <navi>
      <inst id = “inst-01”>
       <time> 10:00 </time>
       <point>
        <longitude>E132.00.00 </longitude>
        <latitude> N37.11.11 </latitude>
       </point>
       <info>
        <voice src = “aaa.wav”> </voice>
         <image src = “xxx.jpg”> </image>
       </info>
      </inst>
      <inst id = “inst-02”>
       <time> 12:00 </time>
       <point>
        <longitude>E132.22.22 </longitude>
        <latitude> N37.33.33 </latitude>
       </point>
       <info>
        <voice src = “bbb.wav”> </voice>
        <image src = “yyy.gif”> </image>
       </info>
      </inst>
      <inst id = “inst-03”>
       <time> 16:30 </time>
       <point>
         <longitude>E132.44.44 </longitude>
        <latitude> N37.55.55 </latitude>
       </point>
      </inst>
     </navi>
    </naviscript>
  • Note that the script semi-automatically generating [0220] unit 19 may be arranged not only in the user terminal 10, but also in the center 40 or a different portable terminal, etc.
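  • The conversion from the obtained time series to the tagged text could look like the following sketch; the sample dictionary keys and the formatting helper are assumptions made for illustration, not the actual implementation of the script semi-automatically generating unit 19.
    def to_naviscript(samples):
        # Sketch of the tagging step: each time-series sample is assumed to be a dict with
        # "time", "longitude", and "latitude", plus optional "voice"/"image" file names.
        lines = ['<naviscript>', ' <navi>']
        for i, s in enumerate(samples, start=1):
            lines += ['  <inst id = "inst-%02d">' % i,
                      '   <time> %s </time>' % s['time'],
                      '   <point>',
                      '    <longitude> %s </longitude>' % s['longitude'],
                      '    <latitude> %s </latitude>' % s['latitude'],
                      '   </point>']
            if 'voice' in s or 'image' in s:
                lines.append('   <info>')
                if 'voice' in s:
                    lines.append('    <voice src = "%s"> </voice>' % s['voice'])
                if 'image' in s:
                    lines.append('    <image src = "%s"> </image>' % s['image'])
                lines.append('   </info>')
            lines.append('  </inst>')
        lines += [' </navi>', '</naviscript>']
        return '\n'.join(lines)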
  • Example of Applying the Present Invention to a PC
  • Provided next is the explanation about a case where the present invention is applied to a portable PC (Personal Computer). FIG. 15 exemplifies the configuration of this system. A [0221] personal computer 100 corresponds to the user terminal 10 shown in FIG. 1. A Web center 200 corresponds to the center 40 shown in FIG. 1. An input processing unit 131 corresponds to the operation inputting unit 11, the network accessing unit 12, and the medium accessing unit 13, which are shown in FIG. 1. An output processing unit 132 corresponds to the navigation outputting unit 18 shown in FIG. 1. Similarly, a script converting unit 134 corresponds to the script converting unit 14. An instruction processing unit 135 corresponds to the instruction processing unit 15. A time/point generating unit 136 corresponds to the state generating unit 17. A map/information managing unit 140 and a voice data managing unit 150 correspond to the database unit 20. A clock 160 and a GPS (Global Positioning System) 170 or a PHS position detecting unit 180 correspond to the state acquiring unit 16.
  • A [0222] naviscript system 120 for navigation is built, for example, into a Web browser 110 of the personal computer 100 as plug-in software.
  • FIG. 16 shows an example of a home page screen of a naviscript service provided by a Web center. Supposing that the URL of the home page of the naviscript service is specified on the screen of the [0223] Web browser 110, a menu screen like the one shown in FIG. 16 and described in HTML (HyperText Markup Language) is delivered from the Web center 200, and displayed on the display screen of the personal computer 100.
  • Assume that the user selects Shibuya Ward as an area, selects “view” as a category, clicks a retrieval button, and transmits a retrieval request to the [0224] Web center 200 via the Web browser 110 on the screen shown in FIG. 16. In this case, the HTML source displaying the screen shown in FIG. 17 is transmitted from the Web center 200, so that the screen shown in FIG. 17 is displayed by the Web browser 110. Here, two naviscript information items are displayed as retrieval results.
  • When a user selects, for example, a course No. 1 on the screen shown in FIG. 17 and clicks the corresponding button, the naviscript for navigating the course from “Tokyo ◯◯ City” to “National Noh Theater” via “□□ Communication Center” is downloaded from the [0225] Web center 200 to the personal computer 100. As a result, the naviscript system 120 is activated.
  • Note that the [0226] naviscript system 120 may be activated by the Web browser 110 accessing a file, based on a particular file extension (such as “.nav”) given to a naviscript, after the naviscript is downloaded. FIG. 18 exemplifies the screen displayed by a naviscript browser 130. This screen is displayed by being embedded into the screen displayed by the Web browser 110.
  • The naviscript for the course, which starts from “Tokyo ◯◯ City” and is received by the [0227] input processing unit 131 of the naviscript browser 130 from the Web center 200 via the Web browser 110, is passed to the script converting unit 134. The script converting unit 134 converts the naviscript into structured navigation data, and passes the converted data to the instruction processing unit 135. The instruction processing unit 135 performs the following process as a preparation process of instruction execution.
  • First of all, the [0228] instruction processing unit 135 determines whether the execution mode set by a user is the navigation mode or the simulation mode. If the instruction processing unit 135 determines that the execution mode is the navigation mode, it issues a request to acquire a state to the clock 160 and the GPS 170 or the PHS position detecting unit 180, obtains an actual current time/point as a state, and adds the instruction for departing from the actual current position to the beginning of the structured navigation data.
  • Next, the [0229] instruction processing unit 135 adds a flag indicating “original” to all of the information items such as a time, a point, a route, navigation information, etc. of each instruction within the structured navigation data, and retrieves from the map/information managing unit 140 an attribute or attributes yet to be described among various attributes relating to a place (such as latitude, longitude, altitude, a name, an address, a telephone number, a zip code, etc.) by using described attributes as keys. For example, if only an area such as “Shinjuku Ward” is specified, the attributes of places representative of this area, such as Shinjuku Station, Shinjuku Ward Office, etc., are retrieved. If a plurality of retrieval results exist at this time, the instruction processing unit 135 inquires of a user which result to select by using a menu, etc., or selects one of them by using a suitable evaluation index, and describes the retrieved or selected attribute in the corresponding portion of the structured navigation data (if a plurality of retrieval results exist also in subsequent processes, they are processed in a similar manner). Note that such a complementing process is normally performed for a naviscript created by a user himself. If a naviscript is the one downloaded from a center, it is assumed to describe the navigation information of the entire course from the beginning. Accordingly, this process is omitted.
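  • This attribute complementing step amounts to a keyed lookup followed by filling in only the missing fields, as in the sketch below; the lookup() callback and the dictionary representation of a place are assumptions made for illustration.
    def complement_attributes(place, lookup):
        # Sketch: lookup() stands in for the map/information managing unit 140 and returns
        # candidate records (dicts with latitude, longitude, address, ...) for the known keys.
        known = {k: v for k, v in place.items() if v is not None}
        candidates = lookup(known)                 # e.g. an area name -> representative places
        if not candidates:
            return place
        chosen = candidates[0]                     # or inquire of the user / apply an evaluation index
        for key, value in chosen.items():
            if place.get(key) is None:             # only attributes yet to be described are filled in
                place[key] = value
        return place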
  • Next, the [0230] instruction processing unit 135 retrieves a route if a route is not specified in the items relating to a route within the structured navigation data or, for example, only a category such as a normal road, a toll road, a highway, time precedence, distance precedence, etc. is specified. If an entire route is specified, the instruction processing unit 135 determines whether the route is available or unavailable. If the instruction processing unit 135 determines that the route is unavailable, it retrieves a different route. The instruction processing unit 135 additionally describes the retrieved route in the corresponding portion of the instruction, and attaches a flag indicating “addition” to the route.
  • Then, the [0231] instruction processing unit 135 converts all of relatively specified places into absolutely specified places, describes the converted places in corresponding portions of the structured navigation data, estimates expected arrival times at all of the places, and describes the expected time in corresponding portions of the structured navigation data.
  • The [0232] instruction processing unit 135 then rearranges all of the instructions in time order, and sets an instruction cursor on the initial instruction of the structured navigation data.
  • In the meantime, if the [0233] instruction processing unit 135 determines that the execution mode is the simulation mode, it issues a request to prepare a state to the time/point generating unit 136, further issues a request to generate a state upon completion of the process for preparing a state, obtains a virtual current time and point, and adds the instruction for departing from the virtual current point to the beginning of the structured navigation data. Thereafter, a preparation process similar to that in the navigation mode is performed.
  • The [0234] instruction processing unit 135 performs the following process as an execution process. Provided first is the explanation about the case where the execution mode is the navigation mode. If the instruction processing unit 135 determines that the execution mode is the navigation mode, it issues a request to acquire a state to the GPS 170 and the clock 160, and obtains the information about an actual current time and point.
  • The [0235] instruction processing unit 135 then determines whether or not the current position is on a specified route. If the instruction processing unit 135 determines that the current position is not on the route specified by the instruction indicated by the instruction cursor, it retrieves a route from the current point to a near point on the route, additionally describes the retrieved route to the corresponding portion of the instruction, and attaches a flag indicating “addition” to the route. If a plurality of retrieval results exist, one of them is selected with user specification or by using a suitable evaluation index.
  • For example, if a traffic information use setting that a user separately makes is ON, traffic information (such as a traffic jam, regulation, accident, etc.) is obtained from VICS (Vehicle Information Communication System). Furthermore, if an automatic route change setting that the user separately makes is ON, if the flag of the route specified by the instruction indicated by the instruction cursor is “addition”, and if a traffic jam, regulation, accident, etc. occurs up to the nearest indispensable point or the nearest indispensable route, to which the flag indicating “original” is attached and which is included in the instructions subsequent to that indicated by the instruction cursor, the [0236] instruction processing unit 135 retrieves a route from the current point to the indispensable point or route. The instruction processing unit 135 then modifies the route by describing the retrieved or selected route in the corresponding portion of the instruction, and attaches a flag indicating “addition” to the modified route. Since the traffic information use setting and the automatic route change setting are normally used when the present invention is applied to a car navigation system, etc., a user turns off these settings when the user moves on foot as assumed in this example.
  • Next, if the time/point described in the instruction indicated by the instruction cursor matches an actual current time/point (or is within a range including an error caused by sampling, etc.), the [0237] instruction processing unit 135 issues a request to output navigation to the output processing unit 132, passes navigation information thereto, and updates the point of the instruction cursor to that of the next instruction. The instruction processing unit 135 repeats the above described process until there is no instruction left.
  • If the execution mode is the simulation mode, the [0238] instruction processing unit 135 issues a request to generate a state to the time/point generating unit 136, obtains a virtual current time/point, and determines whether or not a virtual current point is on a specified route. If the instruction processing unit 135 determines that the virtual current point is not on the route specified by the instruction indicated by the instruction cursor, it retrieves a route to the virtual current point or to a near point, additionally describes the retrieved route in a corresponding portion of the instruction, and attaches a flag indicating “addition” to the route.
  • If the time/point described in the instruction indicated by the instruction cursor matches the virtual current time/point (or is within a range including an error caused by sampling, etc.), the [0239] instruction processing unit 135 issues a request to output navigation to the output processing unit 132, passes navigation information thereto, and updates the point of the instruction cursor to that of the next instruction. The instruction processing unit 135 repeats the above described process until there is no instruction left.
  • Upon receipt of the request to prepare a state from the [0240] instruction processing unit 135, the time/point generating unit 136 sets as a virtual departure time either an actual current time or a time that the user separately sets, whichever is selected by the user or the system, sets as a virtual departure point either an actual current point or a point that the user separately sets, whichever is selected by the user or the system, and sets as a virtual time elapse speed either a default virtual time elapse speed that the system sets or a virtual time elapse speed that the user separately sets, whichever is selected by the user or the system. The time/point generating unit 136 further sets as a virtual place move speed either a default virtual place move speed that the system sets or a virtual place move speed that the user separately sets, and sets as a simulation sampling time period either a default simulation sampling time period that the system sets or a simulation sampling time period that the user separately sets, whichever is selected by the user or the system. Additionally, the time/point generating unit 136 respectively sets the virtual departure time and the virtual departure point to the virtual current time and the virtual current point.
  • Furthermore, the time/[0241] point generating unit 136 performs the following process as the state generation process. Upon receipt of the request to generate a state from the instruction processing unit 135, the time/point generating unit 136 passes the virtual current time/point to the instruction processing unit 135, and updates the virtual current time by adding the simulation sampling time period to the virtual current time. Still further, the time/point generating unit 136 updates the virtual current point to a point advanced by a distance calculated by multiplying the virtual place move speed of the place moving means specified in the instruction currently being executed by the simulation sampling time period. However, if the calculated point is beyond the range of the route, the time/point generating unit 136 updates the virtual current point to an end point of the route.
  • Upon receipt of the request to output navigation from the [0242] instruction processing unit 135, the output processing unit 132 outputs navigation information according to the instruction indicated by the instruction cursor. As a result, the course navigation to the National Noh Theater is made on the screen shown in FIG. 18.
  • That is, if the execution mode is the navigation mode, the [0243] output processing unit 132 displays the outline of a route and a distance or time required to reach a destination, or performs navigation vocally and/or by displaying a navigation message or image on the screen shown in FIG. 18 according to the instructions proceeding from the current time or point, while a user is moving on the route that the user himself selects.
  • If the execution mode is the simulation mode, the [0244] output processing unit 132 displays a navigation message or image on the screen shown in FIG. 18 or performs voice navigation according to instructions, based on a set virtual current time/point, and virtual time elapse speed.
  • Example of Applying the Present Invention to a Car Navigation System
  • Next, an example of applying the present invention to a navigation system is provided as another preferred embodiment. FIG. 19 exemplifies the system configuration in the case where the present invention is applied to a car navigation system. A [0245] center 210 corresponds to the center 40 shown in FIG. 1. A navigation outputting unit 302 corresponds to the navigation outputting unit 18 shown in FIG. 1, and the remaining portion of an input/output processing unit 301 corresponds to the operation inputting unit 11, the network accessing unit 12, and the medium accessing unit 13 shown in FIG. 1. A script converting unit 303 corresponds to the script converting unit 14. An instruction processing unit 304 corresponds to the instruction processing unit 15. A time/point generating unit 305 corresponds to the state generating unit 17. A map/information managing unit 310 and a voice data managing unit 320 correspond to the database unit 20. A clock 160, a GPS 170, and a VICS 190 correspond to the state acquiring unit 16.
  • The input/[0246] output processing unit 301 specifies with a menu, etc. the course on which a user desires to drive, and issues a retrieval request to the center 210. After the input/output processing unit 301 downloads a desired naviscript from the center 210, it passes the naviscript to the script converting unit 303. The script converting unit 303 converts the naviscript into structured navigation data, and passes the converted data to the instruction processing unit 304. Thereafter, once a navigation start instruction is issued, the instruction processing unit 304 prepares for an instruction process based on the structured navigation data, and executes instructions.
  • There is a significant difference from a conventional car navigation system. That is, with a conventional method, a user must set points and a route by himself. However, according to the present invention, a recommended course, etc. can be downloaded with ease. Additionally, with the conventional method, a route to a specified destination is navigated vocally or by being displayed on a screen in an inflexible manner based on current point information and map information. Meanwhile, according to the present invention, navigation is output according to a naviscript. Therefore, adequate sightseeing navigation and guidance can also be provided according to a point or a route to be passed through by a tour such as a bus tour. Furthermore, flexible navigation such as navigation to a restaurant at lunchtime and navigation to a parking lot, etc. can be made on demand. [0247]
  • Especially, with the conventional car navigation system, only navigation in terms of places is made. With a naviscript according to the present invention, navigation in terms of time can also be made according to a current time or an elapsed time. Furthermore, a user can create a naviscript that the user himself or an acquaintance uses, set the created naviscript in a car navigation system, and operate the system. [0248]
  • Example of Applying the Present Invention to a PHS
  • FIG. 20 exemplifies the system configuration in the case where the present invention is applied to a PDC or a PHS. In this case, a [0249] naviscript 510 is built in a center 500. A PHS browser 610 of a PHS 600 comprises an input processing unit 611 and an output processing unit 612.
  • Assume that a user issues a request to retrieve a naviscript to a [0250] Web server 700 on which naviscripts are recorded, by using the PHS browser 610 of the PHS 600 via a PHS browsing server 520 within a PHS center 500. An output processing unit 522 within the PHS browsing server 520 downloads a desired naviscript from the Web server 700, and passes the downloaded naviscript to the input processing unit 521. The input processing unit 521 passes the naviscript to the script converting unit 523. The script converting unit 523 parses the naviscript, and converts it into structured navigation data. When the user uses the naviscript in the navigation mode, the instruction processing unit 524 obtains the current state (current time/point) of the user from a clock 620 and a PHS position detecting unit 630 of the PHS 600, and complements the route information of the structured navigation data. The instruction processing unit 524 then obtains required map/information and voice data from a map/information managing unit 530 and a voice data managing unit 540 based on the structured navigation data according to the state, and passes the obtained information and data to the output processing unit 522. The output processing unit 522 outputs navigation by making it viewable on a display screen of the PHS browser 610, etc. via the PHS browsing server 520.
  • In the simulation mode, the [0251] instruction processing unit 524 complements the route information of the structured navigation data by obtaining a virtual current time/point from a time/point generating unit 525, and outputs navigation by making it viewable on the display screen of the PHS browser 610 in a similar manner as in the navigation mode.
  • Example of Applying the Present Invention to a Driving Managing System
  • Provided next is the explanation about an example of the case where the present invention is applied to a driving managing system. [0252]
  • There is a conventional driving managing system which comprises: an inputting unit for inputting data describing an itinerary of a trip desired by a user, a service timetable, or a route; a driving management database describing reservation states of respective highways and facilities, etc., and data such as a traffic jam on a road or in a parking lot, regulations, an accident, weather, etc.; a coordinating unit for making a comparison and coordination between the data of an input desired itinerary/route and that of the driving management database, for modifying the itinerary/route data on demand according to the result of the comparison/coordination, and/or for updating the data of the driving management database; and an outputting unit for outputting the resultant itinerary/route data. [0253]
  • There is a point/route navigating apparatus, which is another implementation of a conventional technique, for performing various types of navigation in a car navigation system, a PC, a PDA, a PDC, a PHS, etc. Such an apparatus comprises: an inputting unit for inputting a point/route (sequence) desired by a user, and an executing unit for performing navigation according to the input point/route (sequence). [0254]
  • Since the format of the itinerary/route data for a reservation in the conventional driving managing system is different from that of the point/route data for navigation in the conventional point/route navigating apparatus, these data must be separately created, managed, and operated, which causes an inconvenience to a developer, an operator, and a user. Additionally, because a data format is different depending on each driving managing system, many types of data must be created, managed, and operated, which also causes an inconvenience to a developer, an operator, and a user. [0255]
  • By applying the present invention to a driving managing system, the formats of data input for making reservations in various driving managing systems can be made common to those of the data input for performing navigation in various point/route navigating apparatuses. [0256]
  • FIG. 21 exemplifies the configuration in the case where the present invention is applied to a driving managing system. A [0257] driving managing center 1000 and a terminal 1010 used by a user are interconnected by a network, and transmit/receive a naviscript. A driving management database 1004 obtains various information items, such as a road state and a reservation state of each facility, from various information providing sources 1020, and manages the obtained information. The terminal 1010 itself may be a portable information device, or an information device to be built into a car navigation system, etc.
  • The [0258] driving managing center 1000 comprises: a receiving unit 1001 for receiving a naviscript transmitted from the terminal 1010; a converting unit 1002 for converting a received naviscript into structured navigation data; a coordinating unit 1003 for making a comparison and coordination between the converted structured navigation data and the data stored in the driving management database 1004; an inversely converting unit 1005 for inversely converting the coordinated structured navigation data into a naviscript; and a transmitting unit 1006 for returning the converted naviscript to the terminal 1010.
  • The [0259] terminal 1010 comprises: an input processing unit 1011 for inputting a naviscript desired by a user from a source in a network 1030, a medium 1031 such as a CD-ROM, a magnetic disk, etc., or a keyboard, and the like; a transmitting unit 1012 for transmitting the input naviscript to the driving managing center 1000; a receiving unit 1013 for receiving the naviscript which is modified and transmitted by the driving managing center 1000; a converting unit 1014 for converting the received naviscript after being modified into structured navigation data which can be executed by the local terminal itself; an execution processing unit 1015 for generating navigation (information) based on the converted structured navigation data; and an output processing unit 1016 for outputting the generated navigation (information).
  • The [0260] input processing unit 1011 within the terminal 1010 obtains from the network 1030 or reads from the medium 1031 such as a CD-ROM, a magnetic disk, etc. a desired naviscript which describes the destination currently being headed for and a scheduled itinerary or route of a trip or an operation, or inputs user instruction information with a keyboard.
  • An example of a naviscript that a user desires is provided below. The contents of the naviscript indicate an overnight trip to Lake Yamanaka, and specify a course which is bound from Numazu to Gotenba on Tomei Highway by car, drops in at Fuji ◯◯ Land, and reaches Lake Yamanaka ◯◯ Lodge for an overnight stay on December 23rd. The required time is scheduled to be 6 hours and 30 minutes. [0261]
    <naviscript version = “0.3”>
     <title> example </title>
     <copyright> All Rights Reserved,
       Copyright (C) FujiLabo Ltd. 1998.
    </copyright>
    <navi>
     <title> Overnight Stay at Lake Yamanaka
    </title>
     <date> 1998/12/23 </date>
     <duration> 6hour30min </duration>
     <distance> 100.0km </distance>
     <cost> 15,000yen </cost>
     <seq>
      <inst ref = “inst-point-Numazu”>
      <inst ref = “inst-point-Gotenba”>
      <inst ref = “inst-point-Gotenba ◯◯
    Intersection”>
      <inst ref = “inst-object-Fuji ◯◯ Land”>
      <inst ref = “inst-object-Lake Yamanaka ◯◯
    Lodge parking lot”>
     </seq>
    </navi>
    <inst id = “inst-point-Numazu”>
     <time>
      10:00
     </time>
     <point>
      <name> Numazu </name>
      <category> IC </category>
     </point>
     <route>
      <means> car </means>
      <name> Tomei Highway </name>
      <category> highway </category>
     </route>
    </inst>
    <inst id = “inst-point-Gotenba”>
     <time>
      10:30
     </time>
     <point>
      <name> Gotenba </name>
      <category> IC </category>
    </point>
     <route>
      <means> car </means>
      <category> prefectural road </category>
     </route>
    </inst>
    <inst id = “inst-point-Gotenba ◯◯ Intersection”>
     <time>
      10:30
     </time>
     <point>
      <name> Gotenba ◯◯ Intersection </name>
      <category> intersection </category>
     </point>
     <route>
      <means> car </means>
      <name> Route 136 </name>
      <category> national road </category>
     </route>
    </inst>
    <inst id = “inst-object-Fuji ◯◯ Land”>
     <time>
      11:00
     </time>
     <object>
      <name> Fuji ◯◯ Land </name>
      <category> amusement park </category>
     </object>
     <route>
      thesame
     </route>
    </inst>
    <inst id = “inst-object-Lake Yamanaka ◯◯ Lodge
    parking lot”>
     <time>
      16:00
     </time>
     <object>
      <name> Lake Yamanaka ◯◯ Lodge parking
    lot </name>
      <category> parking lot </category>
     </object>
    </inst>
    </naviscript>
  • When the naviscript that the user desires is input, the [0262] transmitting unit 1012 transmits this naviscript to the driving managing center 1000.
  • Within the [0263] driving managing center 1000, the receiving unit 1001 receives the naviscript transmitted from the terminal 1010, and the converting unit 1002 converts the received naviscript into structured navigation data.
  • Assume that the following data are stored in the [0264] driving management database 1004 within the driving managing center 1000. Data1 and data2 exemplify route data, data3 exemplifies point data, and data4 and data5 exemplify facility data. A maximum number of cars/people indicates the maximum number of cars or people that can utilize a road or a facility. A reserved number of cars/people indicates the number of cars/people that make reservations. A jam percentage of 100% represents the state where the reserved number of cars/people reaches the maximum number, and no more use is allowed.
    <data1>
     main number: 1234567890-01
     subnumber: 19981218-14301500
     type 01: route
     type 02: highway
     name: Tomei Highway
     section: Numazu-Gotenba
     direction: up
     date: 1998/12/18
     time: 14:30-15:00
     reserved number of cars: 1,234
     maximum number of cars: 5,000
     jam percentage: 27[%]
     traffic jam: 0[km]/0[min]
     regulation: none
     accident: none
     weather: light rain
     temperature: 15[C]
     wind velocity: 5[m]
    <data2>
     main number: 1234567890-01
     subnumber: 19981223-10001030
     type 01: route
     type 02: highway
     name: Tomei Highway
     section: Numazu-Gotenba
     direction: up
     date: 1998/12/23
     time: 10:00-10:30
     reserved number of cars: 5,000
     maximum number of cars: 5,000
     jam percentage: 100[%]
     traffic jam:
     regulation:
     accident:
     weather:
     temperature:
     wind velocity:
    <data3>
     main number: 2345678901-01
     subnumber: 19981223-10301100
     type 01: point
     address: ◯◯ Gotenba City,
    Shizuoka Prefecture
     name: Gotenba ◯◯ Intersection
     direction: up on Route 136
     date: 1998/12/23
     time: 10:30-11:00
     traffic jam:
     regulation:
     accident:
     weather:
     temperature:
     wind velocity:
    <data4>
     main number: 4567890123
     subnumber: 19981223
     type 01: facility
     type 02: amusement park
     name: Fuji ◯◯ Land
     address: ◯◯, Shizuoka Prefecture
     date: 1998/12/23
     time: 10:00-20:00
     reserved number of people: 1,000
     maximum number of people: 1,000
     jam percentage: 100[%]
     weather:
     temperature:
     wind velocity:
    <data5>
     main number: 3456789012
     subnumber: 19981223
     type 01: parking lot
     address: ◯◯, Shizuoka Prefecture
     name: Lake Yamanaka ◯◯ Lodge
    parking lot
     date: 1998/12/23
     time: 15:00-10:00 (next day)
     reserved number of cars: 11
     maximum number of cars: 20
     jam percentage: 55[%]
     ...
  • The [0265] coordinating unit 1003 makes a comparison and coordination between the structured navigation data received from the converting unit 1002 and the data such as the states of highways, the states of facilities, a traffic jam on a road or in a parking lot, a regulation, an accident, weather, which are described in the driving management database 1004, modifies the structured navigation data based on the result of the comparison and coordination, and/or updates the data stored in the driving management database 1004.
  • For example, the coordinating [0266] unit 1003 makes a comparison between the above described naviscript that the user desires and <data2> stored in the driving management database 1004, determines that the jam percentage of Tomei Highway is already 100% and this highway is unavailable, and also determines that the jam percentage of the reservation state of Fuji ◯◯ Land is 100% and admission is not allowed. Therefore, the coordinating unit 1003 modifies the corresponding portion of the converted structured navigation data, and/or changes the data of the driving management database 1004.
  • In this case, the contents are modified so that a national road is used instead of Tomei Highway, and the drop-in at Fuji ◯◯ Land is cancelled. In the above described examples of the naviscript and the data of the [0267] driving management database 1004, the following portion (indicated by the corresponding description of the naviscript) of the structured navigation data
    <route>
    <means> car </means>
    <name> Tomei Highway </name>
    <category> highway </category>
    </route>
    is modified as follows:
    <route>
    <means> car </means>
    <name> Route 246 </name>
    <category> national road </category>
    </route>
    Additionally, the following portion is deleted.
    <inst id = “inst-object-Fuji ◯◯ Land”>
    <time>
    11:00
    </time>
    <object>
    <name> Fuji ◯◯ Land </name>
    <category> amusement park </category>
    </object>
    <route>
    thesame
    </route>
    </inst>
  • The inversely converting [0268] unit 1005 converts the structured navigation data modified by the coordinating unit 1003 into a naviscript. An example of the naviscript which is modified by the coordinating unit 1003 and inversely converted by the inversely converting unit 1005 is provided below. The following naviscript indicates the course from Numazu to Lake Yamanaka for an overnight stay via Gotenba ◯◯ Intersection by using Routes 246 and 136.
    <naviscript version = “0.3”>
    <title> example </title>
    <copyright> All Rights Reserved,
    Copyright (C) FujiLabo Ltd. 1998 </copyright>
    <navi>
    <title> Lake Yamanaka Overnight Stay </title>
    <date> 1998/12/23 </date>
    <duration> 6hour30min </duration>
    <distance> 100.0km </distance>
    <cost> 15,000yen </cost>
    <seq>
    <inst ref = “inst-point-Numazu”>
    <inst ref = “inst-point-Gotenba”>
    <inst ref = “inst-point-Gotenba ◯◯
    Intersection”>
    <inst ref = “inst-object-Fuji ◯◯ Land”>
    <inst ref = “inst-object-Lake Yamanaka ◯◯
    Lodge parking lot”>
    </seq>
    </navi>
    <inst id = “inst-point-Numazu”>
    <time>
    10:00
    </time>
    <point>
    <name> Numazu </name>
    <category> IC </category>
    </point>
    <route>
    <means> car </means>
    <name> Route 246 </name>
    <category> national road </category>
    </route>
    </inst>
    <inst id = “inst-point-Gotenba”>
    <time>
    10:30
    </time>
    <point>
    <name> Gotenba </name>
    <category> IC </category>
    </point>
    <route>
    <means> car </means>
    <category> prefectural road </category>
    </route>
    </inst>
    <inst id = “inst-point-Gotenba ◯◯ Intersection”>
    <time>
    10:30
    </time>
    <point>
    <name> Gotenba ◯◯ Intersection </name>
    <category> intersection </category>
    </point>
    <route>
    <means> car </means>
    <name> Route 136 </name>
    <category> national road </category>
    </route>
    </inst>
    <inst id = “inst-object-Lake Yamanaka ◯◯ Lodge
    parking lot”>
    <time>
    16:00
    </time>
    <object>
    <name> Lake Yamanaka ◯◯ Lodge Parking Lot
    </name>
    <category> parking lot </category>
    </object>
    </inst>
    </naviscript>
  • The [0269] transmitting unit 1006 returns the converted naviscript after being modified to the terminal 1010. Within the terminal 1010, the receiving unit 1013 receives the naviscript transmitted from the driving managing center 1000, the converting unit 1014 converts the received naviscript into structured navigation data, the execution processing unit 1015 generates navigation (information) based on the structured navigation data, and the output processing unit 1016 outputs the generated navigation information with a display, a printer, a speaker, etc.
  • FIG. 22 exemplifies a naviscript GUI editor screen displayed on the [0270] terminal 1010. An editor screen 1100 is used to create a naviscript that a user desires, and comprises an operation menu 1101 for selecting and instructing various types of editing operations, and a map operation icon 1102 for moving a display area on a map displayed on a map display area 1103. The user creates his or her desired naviscript by using the GUI which links with the map information displayed in the map display area 1103. The naviscript is represented as a tree structure composed of a course summary portion 1104, a course details portion (navi-tag portion) 1105, (each) instruction portion (inst-tag portion) 1106, etc., and is displayed on the left-hand side of the map display area 1103.
  • An example of a naviscript browser screen displayed on the terminal [0271] 1010 is illustrated in FIG. 23. The browser screen 1110 is displayed when the naviscript after being modified, which is transmitted from the driving managing center 1000, is verified in the simulation mode or executed in the navigation mode. By way of example, the following are arranged on the browser screen 1110: an information display area 1112 for displaying position information at each point; a map display area 1113 for displaying a current position and route of a user on a map; a latitude/longitude display area 1114; a text display area 1115 for displaying text data information included in navigation information; an image display area 1116 for displaying image data information included in navigation information; a map moving button 1121 for moving a display area on a map; a reduction scale changing button 1122 for changing a reduction scale of a map display; a various-types setting button 1123 for setting the simulation mode, the navigation mode, etc.; a simulation start button 1124; a fast-forward button 1125; an end button 1126; a reset button 1127; and the like. As a user moves, the position of an icon indicating the current point on the map displayed in the map display area 1113 also moves, associated navigation information is displayed in the image display area 1116, etc., and voice navigation information is sometimes output from a speaker.
  • FIG. 24 shows the flow of the process performed by a terminal when the present invention is applied to a driving managing system. When the terminal [0272] 1010 inputs a naviscript that a user desires with the input processing unit 1011 (step S101), the transmitting unit 1012 transmits the naviscript to the driving managing center 1000 (step S102). Additionally, the terminal 1010 receives a modified naviscript from the driving managing center 1000 with the receiving unit 1013 (step S103), and the converting unit 1014 converts the received naviscript into structured navigation data to be used in the local terminal 1010 (step S104). The execution processing unit 1015 then creates navigation information based on the structured navigation data (step S105), and the output processing unit 1016 outputs the generated navigation information (step S106).
  • FIG. 25 shows the flow of the process performed by a driving managing center when the present invention is applied to the driving managing system. The [0273] driving managing center 1000 receives a naviscript that a user desires from the terminal 1010 with the receiving unit 1001 (step S111), and the converting unit 1002 converts the received naviscript into structured navigation data (step S112). The coordinating unit 1003 makes a comparison/coordination between the structured navigation data and the data stored in the driving management database 1004. The coordinating unit 1003 then modifies the structured navigation data according to the result of the comparison/coordination, and updates the data stored in the driving management database depending on need (step S113). The inversely converting unit 1005 inversely converts the coordinated structured navigation data into a naviscript (step S114). The transmitting unit 1006 then returns the naviscript to the terminal 1010 (step S115).
  • FIG. 26 shows the flow of the comparison/coordination process (step S113 of FIG. 25) performed within the [0274] driving managing center 1000. The coordinating unit 1003 sets the instruction cursor to the leading instruction of structured navigation data (step S121), and extracts the contents relating to time, a place, and a route from the instruction indicated by the instruction cursor (step S122).
  • The [0275] coordinating unit 1003 sets a data index in the leading data within the driving management database 1004 (step S123), and retrieves the data matching the contents extracted from the instruction while incrementing the data index (step S124). The coordinating unit 1003 determines whether or not matching data exists (step S125). If the coordinating unit 1003 determines that matching data exists, it further determines whether or not there is room in the reservation items of the data (step S126). If the coordinating unit 1003 determines that there is no room in the reservation items, it modifies the extracted contents to replaceable ones (step S127). The process then goes back to step S123. If the coordinating unit 1003 determines that there is room in the reservation items, it adds one more reservation to the data (step S128), and copies the instruction indicated by the instruction cursor at the end of the modified structured navigation data (step S129). If no matching data exists in step S125, the process goes to step S129.
  • Next, the coordinating [0276] unit 1003 increments the instruction cursor by 1 (step S130), and determines whether or not the instruction cursor exceeds the last instruction (step S131). If the instruction cursor does not exceed the last instruction, the process goes back to step S122. If the instruction cursor exceeds the last instruction, the process is terminated.
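  • The loop of FIG. 26 could be sketched as follows; the tuple key, the reserved/maximum fields, and the find_replacement() callback are simplifications assumed for this example rather than the actual data layout of the driving management database 1004.
    def coordinate(instructions, database, find_replacement):
        # FIG. 26 sketch: find_replacement() is assumed to eventually return contents that
        # are either unreserved or absent from the database, so the inner loop terminates.
        coordinated = []
        for inst in instructions:                              # steps S121, S130, S131
            while True:
                key = (inst.get("time"), inst.get("point"), inst.get("route"))   # step S122
                match = next((d for d in database if d["key"] == key), None)     # steps S123-S125
                if match is None:
                    break                                      # no matching data: keep the instruction
                if match["reserved"] < match["maximum"]:       # step S126: room in the reservation items?
                    match["reserved"] += 1                     # step S128: add one more reservation
                    break
                inst = find_replacement(inst)                  # step S127: replace by available contents
            coordinated.append(inst)                           # step S129: copy to the coordinated data
        return coordinated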
  • As described above, the present invention is applied to a driving managing system, so that naviscripts are transmitted/received between the [0277] driving managing center 1000 and the terminal 1010. As a result, itinerary/route data for reservations in various driving managing systems and point/route data for navigation in various point/route navigating devices can be made common. Additionally, the driving managing center 1000 makes coordination between a naviscript, which is desired by a user and transmitted from the terminal 1010, and the driving data managed by the driving management database 1004, and returns the coordinated naviscript. Consequently, the user can utilize the navigation information in which various items of driving management information are reflected. Namely, the following effects can be expected by applying the present invention to the driving managing system.
  • (a) The itinerary/route data for reservations in various driving managing systems and the point/route data for navigation in various point/route navigating devices can be made common. [0278]
  • (b) Since a naviscript is text data that can be described by using a combination of a name which can identify the type of each piece of information and the contents thereof, it can be easily read, written, retrieved, and processed. [0279]
  • (c) Anybody can provide and utilize a naviscript by using a network or an electronic medium anywhere at any time. [0280]
  • (d) Both actual navigation at a moving site and simulated navigation at a user home or company can be made. [0281]
  • Example of Applying the Present Invention to a Time Coordinating System During a Move
  • Provided next is the explanation about an example of applying the present invention to a time coordinating system during a move. [0282]
  • A conventional navigation system or scheduler, etc. never automatically proposes a method for coordinating time. When a user cannot move on schedule, he or she must manually retrieve a state or information to reset the schedule. [0283]
  • Assume that a user must move to a point A by a time T so as to attend the meeting held at the point A. In this case, a device which automatically determines whether the user can walk from the current point C to the point A in time at a normal pace or at a quick pace, or whether the user cannot walk to the point A in time even at a quick pace, etc., and which proposes an action to or warns the user each time accordingly, is very helpful to the user. By applying the present invention to such a time coordinating system during a move, the above described objective can be achieved. Namely, a new schedule can be proposed to a user or a schedule can be automatically reset by automatically retrieving a state or information. [0284]
  • FIG. 27 exemplifies the configuration of the time coordinating system during a move, to which the present invention is applied. This system comprises a [0285] scheduler 1200 for managing an action timetable based on a schedule described by a naviscript; an action rule base 1220; and a monitor (monitoring/executing device) 1210 for monitoring a current position/time, and for presenting/executing the action of a rule if a matching rule exists in the action rule base 1220. In the action rule base 1220, an action to be executed is described according to whether or not there is enough time before an arrival time.
  • The [0286] scheduler 1200 calculates expected arrival times at respective points from a current place to a destination based on an input schedule. The monitor 1210 comprises a current position measuring unit 1211 for measuring the current position of a user; a current time measuring unit 1212 for measuring a current time; a next point expected arrival time calculating unit 1213 for calculating an expected arrival time at each point from the current point; a rule base matching unit 1214 for making a matching between the expected arrival time at the next point and rules within the action rule base 1220; and an action executing unit 1215 for executing an action to be executed by a user according to a corresponding rule (depending on whether or not the user will arrive by an expected arrival time).
  • The flow of the process performed by this system is explained below. First of all, assume that: [0287]
  • the information at an “i”th point is point_{i}; [0288]
  • a move method from the “i”th point to an “i+1”th point is means_{i, i+1}; [0289]
  • an expected arrival time at the “i+1”th point is time_{i+1}; and [0290]
  • a flag for setting the speed of the move method to a normal speed (normal-speed) or a maximum speed (max-speed) is speed. [0291]
  • Also assume that: [0292]
  • the information at a current point is point_{now}, which exists between point_{i} and point_{i+1} without loss of generality; [0293]
  • the move method from the current point to the “i+1”th point is means_{now, i+1}; and [0294]
  • an expected arrival time at the “i+1”th point is time_{i+1, speed} based on the assumption that the speed of the move method is “speed”. [0295]
  • The [0296] scheduler 1200 inputs as an initial schedule the information at respective points, the move method to the respective points, the current time (initially, the time at the first point (departure place)), and the speed of the move method (normal/maximum). For example, if a move method is walking, the normal speed of the move method is a speed at which a user walks at a normal pace, while the maximum speed is a speed at which the user walks at a quick pace. If the move method is a train or a bus, there is no need to make a distinction between the normal and the maximum speeds.
  • The information that the [0297] scheduler 1200 inputs as the initial schedule is as follows.
  • point_{[0298] 1}, point_{2}, . . . , point_{n},
  • means_{[0299] 1,2}, means_{2,3}, . . . , means_{n−1,n},
  • time_{[0300] 1}, . . . , time_{j}, . . . , time_{k}, . . .
  • speed [0301]
  • where time_{[0302] 1} indicates that a departure time is specified, and time_{j} and time_{k} respectively indicate that the arrival times at the “j”th and “k”th points are specified. The scheduler 1200 calculates the expected arrival times at unspecified points among those at the respective points from a departure place to a destination, and transmits the calculated times to the monitor 1210. Namely, the following times are transmitted in this case.
  • time_{[0303] 2}, time_{3}, . . . , time_{n}
  • The initial schedule may sometimes be modified by the [0304] monitor 1210. If the input schedule is one modified by the monitor 1210, the input of the scheduler 1200 includes the information about the respective points from the current point to the destination, the move method to the respective points, the current time, and the speed of the move method.
  • Namely: [0305]
  • point_{now}, point_{i+1}, . . . , point_{n}, [0306]
  • means_{now,i+1}, means_{i+1,i+2}, . . . , means_{n−1,n}, [0307]
  • time_{now}, [0308]
  • speed [0309]
  • The output of the [0310] scheduler 1200 includes expected arrival times.
  • time_{i+1}, time_{i+2}, . . . , time_{n}[0311]
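  • The calculation performed by the scheduler 1200 can be illustrated with the following minimal Python sketch (an illustration only, not part of the naviscript specification; the function name, the minute-based time representation, and the assumption that each move method carries a travel time at the normal and at the maximum speed are introduced solely for this example). Starting from the current time, it accumulates the travel time of each leg and returns the expected arrival times at the remaining points.
    # Minimal sketch only; the function name and data layout are assumptions.
    # Each move method (leg) carries estimated travel times, in minutes,
    # at the normal speed and at the maximum speed.
    def schedule(means, time_now, speed="normal-speed"):
        """Return the expected arrival times time_{i+1}, . . . , time_{n}."""
        arrival_times = []
        t = time_now                   # current time, in minutes from midnight
        for leg in means:              # leg = means_{now,i+1}, ..., means_{n-1,n}
            t += leg[speed]            # add the travel time of this leg
            arrival_times.append(t)    # expected arrival time at the next point
        return arrival_times
    # Example: three legs on foot, departure at 9:00 (540 minutes).
    legs = [{"normal-speed": 12, "max-speed": 8},
            {"normal-speed": 20, "max-speed": 15},
            {"normal-speed": 7, "max-speed": 5}]
    print(schedule(legs, 540))                 # expected arrival times at a normal pace
    print(schedule(legs, 540, "max-speed"))    # expected arrival times at a quick pace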
  • The [0312] monitor 1210 obtains the current time and point (here, the current point is assumed to be point_{now} existing between point_{i} and point_{i+1}) at predetermined time intervals, at predetermined distance intervals and/or for predetermined places with the use of the current time measuring unit 1212 and the current position measuring unit 1211.
  • The next point expected arrival [0313] time calculating unit 1213 calculates an expected arrival time time_{i+1,normal-speed} when the speed of a move method is normal, and an expected arrival time time_{i+1,max-speed} when the speed of the move method is maximum, from the current point to the next point based on the information of the current time and point.
  • The rule [0314] base matching unit 1214 makes a matching between the expected arrival time at the calculated normal/maximum speed and the rules within the action rule base 1220, and extracts a matching rule. The action executing unit 1215 executes the action described in the extracted rule in accordance with the rule.
  • An example of the rules stored in the [0315] action rule base 1220 is provided below.
  • if (time_{i+1,normal}<=time_{i+1}) [0316]
  • action(output(Well ahead of schedule)) [0317]
  • if (time_{i+1,max}<=time_{i+1}<time_{i+1,normal}) [0318]
  • action(output(Behind schedule unless going faster)) [0319]
  • if (time_{i+1}<time_{i+1,max}) [0320]
  • action(output(Behind schedule)); [0321]
  • ask($1:Rescheduling at normal speed?; [0322]
  • $2:Rescheduling at maximum speed?; [0323]
  • $3:Canceling this schedule?); [0324]
  • if ($1) action(schedule(point_{now}, . . . , [0325]
  • point_{n}, [0326]
  • means_{now,i+1}, . . . , [0327]
  • means_{n−1,n}, [0328]
  • time_{now}, [0329]
  • normal-speed)); [0330]
  • if ($2) action(schedule(point_{now}, . . . , [0331]
  • point_{n}, [0332]
  • means_{now,i+1}, . . . , [0333]
  • means_{n−1,n}, [0334]
  • time_{now}, [0335]
  • max-speed)); [0336]
  • if ($3) action(clear-schedule);) [0337]
  • The contents of these rules are as follows. If a user who moves from the current point to the next point at the normal speed reaches the next point well ahead of schedule, the [0338] action executing unit 1215 displays the message “Well ahead of schedule” on a display screen, etc. based on the action specified by the corresponding rule, and/or vocally outputs the message.
  • If the user can reach the next point by the expected arrival time not at the normal speed but at the maximum speed, the [0339] action executing unit 1215 displays and/or vocally outputs the message “Behind schedule unless going faster”.
  • Additionally, if the user cannot reach the next point even at the maximum speed, the [0340] action executing unit 1215 displays the message “Behind schedule”, and further displays an inquiry message for prompting the user to select a coordination method after that. The inquiry messages used in this example are menus making the following inquiries.
  • “Rescheduling at a normal speed?”, [0341]
  • “Rescheduling at a maximum speed?”, [0342]
  • “Canceling this schedule?”[0343]
  • If the user selects the message “Rescheduling at a normal speed”, the [0344] action executing unit 1215 modifies the subsequent schedule, and transmits the modified schedule to the scheduler 1200. The scheduler 1200 calculates the expected arrival times at the respective points from the current point to the destination at the normal speed based on the modified schedule, and returns the calculated times to the monitor 1210. In this case, the expected arrival time at the destination is behind the initial schedule.
  • If the user selects the message “Rescheduling at the maximum speed”, the [0345] action executing unit 1215 modifies the subsequent schedule to that at the maximum speed, and transmits the modified schedule to the scheduler 1200. The scheduler 1200 calculates the expected arrival times at the respective points from the current point to the destination at the maximum speed, and returns the calculated times to the monitor 1210.
  • If the user selects the message “Canceling this schedule”, the [0346] action executing unit 1215 clears the schedule information, and cancels the schedule.
  • FIG. 28 exemplifies the display screen of the [0347] monitor 1210. This display screen displays the scheduled en-route spots and their expected arrival times, the move method, a map, etc. according to the schedule, and also indicates the point at which the user currently stays. The user is currently moving from Nakameguro Station of the subway to Nakameguro Station of the Tokyu Toyoko Line according to the move schedule from Makuhari Building to the Kawasaki Plant, which is shown in FIG. 28. The current time in this situation is also displayed. Although the user is scheduled to take the train which starts from Nakameguro Station of the Tokyu Toyoko Line at 9:34 in this case, it is evident at the current time point (9:40) that the user will miss the train. Therefore, a message indicating that the user cannot arrive in time at the next point (the train starting from Nakameguro Station at 9:34) is displayed on the screen according to the corresponding rule within the action rule base 1220.
  • FIG. 29 shows the flow of the process performed by the [0348] scheduler 1200. The scheduler 1200 inputs a sequence of positions of the respective points from the current point to the destination of the schedule, a sequence of move methods between the respective points, the current time, and the speed types (normal/maximum) of the move methods (step S201). The scheduler 1200 calculates expected arrival times at the respective points from the current point to the destination (step S202), and outputs the expected arrival times at the respective points (step S203).
  • FIG. 30 shows the flow of the process performed by the [0349] monitor 1210. The monitor 1210 repeats the operations performed in steps S212 to S214 at predetermined time intervals, at predetermined distance intervals and/or for predetermined places (step S211). The monitor 1210 calculates the expected arrival times at the next point in the case where the user moves at the normal speed and at the maximum speed (step S212). The monitor 1210 then makes a matching between the expected arrival times at the next point and the rules within the action rule base 1220 (step S213), and executes the action of the corresponding rule (step S214).
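  • The monitoring step of FIG. 30 and the matching against the action rule base can be sketched as follows. This is a simplified Python illustration only; the distance-based arrival estimation, the walking speeds, and the helper names are assumptions and do not reproduce the actual units 1211 to 1215.
    # Simplified sketch of one monitoring step; names and values are assumptions.
    def expected_arrival(distance_to_next_m, speed_m_per_min, time_now_min):
        """Expected arrival time at the next point for a given walking speed."""
        return time_now_min + distance_to_next_m / speed_m_per_min
    def match_rules(t_normal, t_max, t_scheduled):
        """Return the action of the matching rule in the action rule base."""
        if t_normal <= t_scheduled:
            return "Well ahead of schedule"
        if t_max <= t_scheduled < t_normal:
            return "Behind schedule unless going faster"
        # t_scheduled < t_max: the user cannot arrive in time even at a quick pace
        return ("Behind schedule; reschedule at normal speed, "
                "reschedule at maximum speed, or cancel this schedule?")
    # The current point/time are measured, then the expected arrival times are matched.
    time_now = 9 * 60 + 40                         # 9:40, in minutes
    t_sched = 9 * 60 + 34                          # scheduled arrival at 9:34
    t_norm = expected_arrival(400, 80, time_now)   # 400 m left at 80 m/min (normal pace)
    t_max = expected_arrival(400, 120, time_now)   # 400 m left at 120 m/min (quick pace)
    print(match_rules(t_norm, t_max, t_sched))     # -> the "Behind schedule" action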
  • Navigation Plan Creating and Information for Guidance Managing System
  • The method for creating a naviscript was described earlier. Provided below are the explanations about a method for managing various guidance data (information items for guidance) associated with points or times, and a method for making a combination of information items for guidance (hereinafter referred to as a navigation plan). [0350]
  • Conventionally, there are several systems for creating a navigation route to a destination. For example, with a car navigation system, route navigation is created based on the map information, such as a distance or an intersection name, that the map itself originally possesses, when a destination is specified. However, there has been no navigation system which creates route information including explanatory information about shops, hotels, sightseeing facilities, etc. Such navigation information, for example sightseeing information and route information, is nevertheless frequently demanded. [0351]
  • Additionally, a number of information databases relating to shops, hotels, and sightseeing facilities conventionally exist. However, most of them merely classify their information into categories, so that the information is not managed in a form effective for implementing route navigation. Since the required contents of information for guidance, such as sightseeing information, vary depending on the season or on an attribute of the person receiving the information for guidance, it is desirable to easily create information for guidance or a navigation plan which corresponds to the attribute. Therefore, a system which efficiently and easily creates a navigation plan such as a sightseeing navigation plan, a route navigation plan, etc., and a system which manages the information for guidance for creating the navigation plan, are demanded. [0352]
  • With this system, for example, a user sets the route of a sightseeing tour by himself, and obtains the information for guidance corresponding to the attribute of the user at the respective points on the arbitrarily set route from the navigation information database managing information for guidance such as the explanation a tour conductor would give, so that the user can create a navigation plan. Additionally, the created navigation plan is executed by the user's portable terminal while the user actually moves on the route, so that the user can enjoy sightseeing according to the individual tour plan even if no tour conductor attends. [0353]
  • Additionally, the conventional car navigation system performs navigation based on map information, such as “Turn to the left 300m ahead” or “Turn to the right at ◯◯ Intersection”, which is in some cases less natural and less understandable than the navigation given by a person. If a route navigation plan adopts not map information such as an intersection name, but navigation information using a landmark such as a building or a signboard, for example, “Turn to the left after passing the big ΔΔ signboard”, route navigation which is as natural and understandable as that given by a person can easily be created. [0354]
  • FIG. 31 exemplifies the configuration of the navigation plan creating and information for guidance managing system. This system comprises a navigation [0355] plan creating device 1300 and an information for guidance database managing device 1310.
  • The information for guidance [0356] database managing device 1310 manages an information for guidance database 1311 storing information for guidance attached to a point or time in map data, a schedule, a timetable, a calendar, etc. and a navigation plan.
  • The navigation [0357] plan creating device 1300 comprises: an information for guidance attaching unit 1301 for attaching information for guidance to map/schedule data stored in the information for guidance database 1311 and for making an association between them; a condition setting unit 1302 for setting a condition such as a time period during which information for guidance is valid, a valid attribute, etc; a route setting unit 1303 for setting a route of a navigation plan; an information for guidance extracting unit 1304 for extracting information for guidance required for a navigation plan from the information for guidance database 1311; and a navigation plan creating unit 1305 for creating a navigation plan based on the extracted information for guidance.
  • A [0358] terminal 1320 comprises: a navigation plan executing unit 1321 for executing a received navigation plan according to a user point/time; a presenting unit 1322 for presenting navigation information to the user; a current point obtaining unit 1323 for obtaining the current point of the user; and a time measuring unit 1324 for obtaining the current time. The terminal 1320 is, for example, a car navigation device, a PC, a PDA, a PHS, a PDC, etc.
  • FIG. 32 summarizes the flow of the processing performed by this system. The information for [0359] guidance attaching unit 1301 attaches information for guidance to a point/time of map/schedule data. Here, “attach” means that information for guidance is associated with a particular point on a screen such as a map, a schedule, a calendar, etc. As occasion demands, the condition setting unit 1302 sets a time condition such as a time period during which attached information for guidance is valid or an attribute condition such as a user type, etc., and stores a set condition in the information for guidance database 1311 (step S301).
  • The [0360] route setting unit 1303 sets a route of a navigation plan. To set a route, points/areas to be included in the route are selected on the map data displayed on a display device, etc., and time/attribute conditions of a navigation plan are further set (step S302).
  • The information for [0361] guidance extracting unit 1304 extracts from the information for guidance database 1311 the information for guidance corresponding to the points/areas and the time/attribute conditions, which are set for the route of the navigation plan (step S303), and the navigation plan creating unit 1305 creates a navigation plan by using the extracted information for guidance based on the set route (step S304). The terminal 1320 executes the created navigation plan and presents the navigation information according to the user point or time (step S305).
  • FIGS. 33A and 33B exemplify the process for attaching navigation information. Graphic areas such as rectangles, ellipses, etc. indicate ranges where information for guidance are valid, and can be arbitrarily set by a user. [0362]
  • The point to which information for guidance is attached is specified by designating a point or an area represented as a graphic such as a rectangle, an ellipse, etc. on the screen displaying the [0363] map data 1330 as shown in FIG. 33A, by designating a facility object such as a building, a road, etc. on a map, or by directly describing the place in a naviscript, etc. For example, a range where the Rainbow Bridge is seen is set as a rectangle on the map data 1330, and the information for guidance 1331 of the Rainbow Bridge is attached thereto. Additionally, as shown in FIG. 33B, a time slot of the schedule data 1341 in the form of one day is specified, and the information for guidance 1342 and 1343 are attached thereto. Or, a date (one or a couple of dates) of the schedule data 1344 in the form of one month is specified, and the information for guidance 1345 is attached thereto.
  • Specifically, assuming that a point A is specified on the map data displayed on a display device, the information for guidance setting screen shown in FIG. 34 is displayed by the information for [0364] guidance attaching unit 1301. The information for guidance setting screen 1350 includes an information for guidance input field 1351 for directly inputting information for guidance, an image file name input field 1352 for inputting the name of a file to be used if information for guidance is image data, a voice file name input field 1353 for inputting the name of a voice data file, a reference button 1354 for referencing a specified file, a time condition setting button 1355 for setting a time condition, etc., an OK button 1356, and a cancel button 1357.
  • The contents of attached information for guidance are sightseeing navigation information, for example, “This was built in the year □□, and is famous for XX . . .”, etc. The information for guidance may be directly input from the information for [0365] guidance input field 1351, the information created by a travel agent may be used, or the information for guidance may be input by specifying its file name. Furthermore, voice or image information may be attached in addition to text data. In this case, a required file name (such as “bbb.jpg”, “aaa.wav”, etc.) is specified in the image file name input field 1352 or the voice file name input field 1353. Such information for guidance is created also for other specified points or areas in a similar manner.
  • If a time restriction is imposed on information for guidance, the time [0366] condition setting button 1355 is clicked to start up the condition setting unit 1302. Then, a time condition such as a time period during which the information for guidance is actually presented, a date, a time period during which the information for guidance is valid, etc. is set on another setting screen (not shown). Furthermore, the direction from which the area to which the information for guidance is to be attached is approached may be specified, so that a condition of presenting the information for guidance only when the area is approached from a particular direction may also be set.
  • FIG. 35 shows an example of the information for guidance attachment process in the case where the information for [0367] guidance 1361 as the sightseeing navigation contents “This was built in the year □□, and is famous for XX . . .” are attached to the point A on the map data 1360.
  • An example where the result of the information for guidance attachment process is represented by a naviscript is provided below. The contents of the naviscript mean that the information for guidance is presented as text data “This was built in the year □□, and is famous for XX . . .” within a radius of 1 km from a point A (input and named by a user) at the latitude N35.11.11.111, the longitude E135.22.22.222, and at an address 1-1, ΔΔ, ◯◯ City, and the voice and the image data stored in the files “aaa.wav” and “bbb.jpg” are output. The information of the latitude, the longitude and the address of the point A may be described by obtaining the data that the map data originally hold. [0368]
    <naviscript>
    <inst>
    <point>
    <name> A </name>
    <latitude> N35.11.11.111 </latitude>
    <longitude> E135.22.22.222 </longitude>
    <address> 1-1, ΔΔ, ◯◯ City
    </address>
    </point>
    <info area=“1km”>
    <text> This was built in the year □□, and
    is famous for XX . . . </text>
    <voice src = “aaa.wav”/>
    <image src = “bbb.jpg”/>
    </info>
    </inst>
    </naviscript>
  • The information for guidance attached to map or schedule data is stored in the information for [0369] guidance database 1311, and managed by the information for guidance database managing device 1310. The information for guidance attached to the map or the schedule data can be displayed as a guidance sheet. By representing information for guidance as a guidance sheet, it becomes easier to verify the correspondence between the contents of the attached information for guidance and the map/schedule data.
  • FIGS. 36A and 36B exemplify guidance sheets. Guidance sheets [0370] 1371 can be displayed for the respective seasons as shown in FIG. 36A, or a guidance sheet 1372 can be displayed in a way such that the sheet varies as time elapses as shown in FIG. 36B. Additionally, a guidance sheet can be displayed for each user attribute such as age, sex, objectives, etc. Also the correspondence between the time-conditional information for guidance and a point to be attached can be displayed as a guidance sheet which represents the correspondence in a three-dimensional space including the time axis shown in FIG. 37.
  • When a navigation plan is created, the [0371] route setting unit 1303 first specifies a route that a user desires. The route is specified, for example, by using a method with which the system automatically retrieves and sets a route if a departure point and a destination are specified, a method with which a user selects points/roads on a map screen with the use of a pointing device to set a route, a method for setting a route by correcting an optimum route along a line that a user draws on a map screen, etc.
  • The set route passes through some of the areas on the map data, to which the information for guidance are attached beforehand. Assume that there is [0372] map data 1380 on which information for guidance are attached to the areas including the points A to J as shown in FIG. 38A. Also assume that a user specifies the route from a start point “s” to a goal “g”, which is shown in FIG. 38A in order to create a navigation plan on the map data 1380. At this time, the areas that the route passes through are A, F and J. The information for guidance extracting unit 1304 extracts the information for guidance about the areas A, F and J from the information for guidance database 1311. The navigation plan creating unit 1305 creates a navigation plan (from the start point “s” to the goal point “g” via the points A, F and J) by using the extracted information for guidance based on the specified route. Then, a naviscript is created, for example, by giving a name “AFJ Tour” to the plan.
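  • A minimal sketch of this extraction step is given below, assuming for illustration that each item of information for guidance is attached to a circular area and that the specified route is represented by a sequence of sampled points; the variable names, the coordinates, and the planar distance approximation are assumptions, not the actual database layout.
    import math
    # Illustrative data; the coordinates and the circular areas are assumptions.
    guidance_db = {
        "A": {"center": (35.186, 135.373), "radius_km": 1.0, "text": "Guidance for A ..."},
        "F": {"center": (35.556, 135.741), "radius_km": 2.0, "text": "Guidance for F ..."},
        "J": {"center": (35.932, 135.999), "radius_km": 1.0, "text": "Guidance for J ..."},
    }
    def distance_km(p, q):
        # Rough planar approximation, adequate for a small sketch.
        return math.hypot((p[0] - q[0]) * 111.0, (p[1] - q[1]) * 91.0)
    def extract_guidance(route_points, db):
        """Return the guidance entries whose area is crossed by the route."""
        hits = []
        for name, entry in db.items():
            if any(distance_km(p, entry["center"]) <= entry["radius_km"]
                   for p in route_points):
                hits.append((name, entry["text"]))
        return hits
    route = [(35.18, 135.37), (35.40, 135.55), (35.56, 135.74), (35.93, 136.00)]
    print(extract_guidance(route, guidance_db))   # the areas A, F and J are extracted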
  • A navigation plan is represented by a naviscript, so that the navigation plan can be delivered via a network or an electronic medium. As a result, a navigation service that anybody can utilize anywhere can be realized. [0373]
  • Exemplified below is a navigation plan which is created based on the route shown in FIG. 38A and represented by a naviscript. [0374]
    <naviscript>
    <navi>
    <title> AFJ Tour </title>
    <category> historic spots tour </category>
    <transport> car </transport>
    <duration> 3 hours </duration>
    <cost> 3,000 yen </cost>
    <par>
    <inst>
    <point>
    <name> A </name>
    <latitude> N35.11.11.111 </latitude>
    <longitude> E135.22.22.222
    </longitude>
    <address> 1-1, ΔΔ, ◯◯ City
    </address>
    </point>
    <info area=“1.0km”>
    <text>This was built in the year □□,
    and is famous for XX . . . </text>
    <voice src = “aaa.wav”/>
    <image src =“bbb.jpg”/>
    </info>
    </inst>
    <inst>
    <point>
    <name> F </name>
    <latitude> N35.33.33.333 </latitude>
    <longitude> E135.44.44.444
    </longitude>
    <address> 2-2, ΔΔ, ◯◯ City
    </address>
    </point>
    <info area=“2.0km”>
    <text> . . . </text>
    <voice src = “ccc.wav”/>
    <image src = “ddd.jpg”/>
    </info>
    </inst>
    <inst>
    <point>
    <name> J </name>
    <latitude> N35.55.55.555 </latitude>
    <longitude> E135.59.59.999
    </longitude>
    <address> 3-3, ΔΔ, ◯◯ City
    </address>
    </point>
    <info area=“1.0km”>
    <text> . . . </text>
    <voice src = “eee.wav”/>
    <image src = “fff.jpg”/>
    </info>
    </inst>
    . . .
    </par>
    </navi>
    </naviscript>
  • The initial portion from <title> to </cost> indicates the summary of the entire navigation plan, and the portion from <par> indicates individual navigation information. The summary is added to endow the sequence of navigation information from <par> with some meaning, and to facilitate the understanding of the sequence. By way of example, for a sightseeing tour, the contents of navigation can be known at a glance. [0375]
  • When the [0376] user terminal 1320 receives the navigation plan, the navigation plan executing unit 1321 executes the navigation plan according to the current point from the current point obtaining unit 1323 and the current time from the time measuring unit 1324. The presenting unit 1322 presents the navigation information in real time.
  • Suppose that the information indicating that the current point of a user is within 1 km of a point F is transmitted from the current [0377] point obtaining unit 1323. In this case, the navigation plan executing unit 1321 executes the portion of the point F of the navigation plan, and presents the information for guidance about the point F to the user. Note that the information for guidance with a condition restricting execution time is not presented when the navigation plan is actually executed if the condition is not satisfied.
  • Additionally, the navigation [0378] plan executing unit 1321 can also simulate a created navigation plan by executing the plan independently of the information from the current point obtaining unit 1323 or the time measuring unit 1324.
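  • A possible sketch of this execution step on the terminal 1320 is shown below, assuming that the navigation plan has already been parsed into a list of (point, radius, information) entries; the polling loop, the planar distance approximation, and all names are assumptions introduced only for illustration. A simulation, as mentioned above, can be made by replaying recorded positions instead of reading the current point obtaining unit.
    import math
    import time
    # Illustrative parsed plan; the structure and coordinates are assumptions.
    plan = [
        {"name": "A", "pos": (35.186, 135.373), "radius_km": 1.0, "text": "Guidance for A ..."},
        {"name": "F", "pos": (35.556, 135.741), "radius_km": 2.0, "text": "Guidance for F ..."},
        {"name": "J", "pos": (35.932, 135.999), "radius_km": 1.0, "text": "Guidance for J ..."},
    ]
    def distance_km(p, q):
        return math.hypot((p[0] - q[0]) * 111.0, (p[1] - q[1]) * 91.0)
    def execute_plan(plan, get_current_point, present, poll_seconds=10):
        """Present each entry once when the user enters its area."""
        presented = set()
        while len(presented) < len(plan):
            here = get_current_point()              # from the current point obtaining unit
            for entry in plan:
                if (entry["name"] not in presented
                        and distance_km(here, entry["pos"]) <= entry["radius_km"]):
                    present(entry["text"])          # the presenting unit of the terminal
                    presented.add(entry["name"])
            time.sleep(poll_seconds)
    # Simulation: replay recorded positions instead of reading a GPS device.
    positions = iter([(35.0, 135.0), (35.186, 135.373), (35.556, 135.741), (35.932, 135.999)])
    execute_plan(plan, lambda: next(positions), print, poll_seconds=0)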
  • By the way, a range where information for guidance is valid is sometimes specified as a point instead of an area as shown in FIG. 38A. In this case, the navigation [0379] plan creating device 1300 sets an area of a predetermined size in the vicinity of a specified route and extracts the information for guidance about points within that area, so as to present the information for guidance attached to the points close to the specified route.
  • For example, if a route from a start point “s” to a goal “g” via points p[0380] 1, p2, and p3 is specified as shown in FIG. 38B, the route setting unit 1303 expands the route in its periphery as shown in FIGS. 38C and 38D.
  • Additionally, the information for [0381] guidance database 1311 includes a point database 1381 and a schedule database 1382, which are shown in FIG. 38E. The point database 1381 stores information for guidance such as shops 1391, restaurants 1392, parks 1393, gas stations 1394, fire hydrants 1395, etc., while the schedule database 1382 stores the information for guidance attached to schedule data.
  • At this time, the information for [0382] guidance extracting unit 1304 extracts from the point database 1381 the information for guidance about the points included in the area on the expanded route, and the navigation plan creating unit 1305 creates a navigation plan by using the extracted navigation information. In this way, even if information for guidance is attached to a point on a map, a suitable navigation plan can be created.
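  • The route expansion and the selection of nearby points can be sketched as follows, assuming a polyline route in planar coordinates and a fixed buffer width; the width value and the helper names are illustrative assumptions.
    import math
    def point_segment_distance(p, a, b):
        """Distance from point p to the segment a-b (planar coordinates, in km)."""
        ax, ay = a
        bx, by = b
        px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
    def points_near_route(route, candidate_points, width_km):
        """Return the candidate points lying within width_km of any route segment."""
        selected = []
        for name, p in candidate_points.items():
            if any(point_segment_distance(p, route[i], route[i + 1]) <= width_km
                   for i in range(len(route) - 1)):
                selected.append(name)
        return selected
    route = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0)]           # s -> p1 -> g, in km
    points = {"shop1": (1.0, 0.2), "park1": (4.2, 1.5), "gas1": (9.0, 9.0)}
    print(points_near_route(route, points, width_km=0.5))  # -> ['shop1', 'park1']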
  • Explained next is the case where a navigation plan is used for route navigation as another use method. Assume that navigation data which are represented with landmarks, such as “a big signboard of ◯◯” or “a big triangular building”, are prepared in the information for guidance database 1311 and attached to the points on the [0383] map data 1390 in order to make route navigation for an intersection or a branch road. Here, if a user requests the route navigation from “ΔΔ Station” to the destination shown in FIG. 39, the route setting unit 1303 creates a route, and the information for guidance extracting unit 1304 extracts information for guidance for the respective points on the route. For a point not included in the information for guidance database 1311 (a point at which no landmark exists), information for guidance is created by using the information that the map data originally possesses (an intersection name, and the like). The navigation plan creating unit 1305 creates a navigation plan by combining the information for guidance, turn directions, etc.
  • To provide the map shown in FIG. 39 as a navigation plan, it may be sufficient to display the information for guidance extracted from the information for [0384] guidance database 1311, the information for guidance that the map originally possesses, and the route. To make more detailed navigation, information for guidance such as a turn direction, etc. must be added.
  • The user carries the terminal [0385] 1320 storing the created navigation plan, and starts to walk from the ΔΔ Station, which is the departure point. When notified by the current point obtaining unit 1323 that the user approaches a certain point, the navigation plan executing unit 1321 presents the information for guidance attached to that point at the corresponding point on the route. In this way, route navigation, for example, “Go straight from the Station, pass through the intersection A, proceed obliquely right at the big signboard of ◯◯, and pass the intersection at which the big triangular building exists, so that the destination can be found”, is made by presenting the route navigation information at the respective points. As described above, route navigation that a user can easily understand, rather than navigation relying only on conventional point and intersection names, can be created and delivered.
  • Additionally, the [0386] information for guidance database managing device 1310 manages information for guidance and navigation plans as a database, so that information for guidance or a navigation plan can be efficiently created and delivered. Furthermore, information for guidance and navigation plans are managed together, so that it becomes easy to retrieve, evaluate, etc. a particular item of information for guidance or a particular navigation plan.
  • Specifically, the information for [0387] guidance database 1311 stores together navigation data such as data requiring almost the same time or cost, routes passing through the same point, data classified according to a user attribute or a season, etc., which leads to an increase in retrieval efficiency.
  • Furthermore, if a navigation plan which directly matches is not stored, a navigation plan satisfying the request is created by combining stored navigation plans. Supposing that there are navigation plans for the routes “A, B, C, D, E” and “K, L, C, X, Y, Z”, a navigation plan for a route “A, B, C, X, Y, Z” is created based on the common point C. Still further, a particular navigation plan can be called by attaching a code to information for guidance or a navigation plan, and by inputting the code. [0388]
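  • The combination of stored navigation plans described above can be illustrated by the following sketch, in which a plan is represented merely as a sequence of point names (an assumption made only for this example).
    def splice_plans(plan1, plan2):
        """Combine two navigation plans at their first common point, if any."""
        for i, point in enumerate(plan1):
            if point in plan2:
                j = plan2.index(point)
                # keep plan1 up to the common point, then continue with plan2
                return plan1[:i + 1] + plan2[j + 1:]
        return None   # no common point: the plans cannot be combined this way
    print(splice_plans(["A", "B", "C", "D", "E"], ["K", "L", "C", "X", "Y", "Z"]))
    # -> ['A', 'B', 'C', 'X', 'Y', 'Z']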
  • As described above, this system can easily create a navigation plan, and can easily access and retrieve information for guidance having contents such as sightseeing navigation. [0389]
  • System for Processing Information with a Time/Point Condition
  • Provided below is the explanation about a system for processing information with a time/point presentation condition. [0390]
  • Conventionally, when a user retrieves required information, for example, the information about a certain point and restaurants close to that point, a method of retrieving from an information list of restaurants by restricting a condition with an area name, etc. is normally used. Additionally, there are methods for providing information according to a user taste, such as a push technique, an agent technique, etc. Furthermore, there is a technique relating to a portable information system which determines the point data of a user with a GPS receiver, retrieves information from a database by using as a retrieval key data obtained by arbitrarily combining a point, orientation, proceeding direction, eye direction, speed, altitude, date, time, etc., which are obtained from the user point, and outputs the retrieved information (Published Japanese Translation of PCT International Publication for Patent Application No. 8-510578, GPS Retrieving Apparatus). Still further, a technique for transmitting information at a particular time is commonly used. [0391]
  • However, there is no system which imposes a time/point restriction on information itself as a condition for presenting, etc. the information, and which processes the information according to that presentation condition (a time/point restriction). The validity or the value of information varies depending on when and where the information is received. Therefore, if it is possible to present the information at the most effective time or point, the efficiency of both the receiving and the transmitting sides increases. [0392]
  • However, if a presentation condition such as a time or a point is not described in the information itself, the system side must classify and arrange presentation conditions, which leads to a heavy load on a person, or on a system executing automatic processing. Accordingly, it is difficult to provide a service according to a suitable time or point. Additionally, if the terminal side extracts the information that a user desires to be presented from a large amount of information, a heavy operation load is imposed on the user. To automate this process, various pieces of hardware, such as a time/point determining unit, a suitable information extracting unit, etc., and software are required. Therefore, it is disadvantageous to a terminal whose memory, throughput, etc. are restricted. [0393]
  • This system is intended to process information with an absolute or relative time/point condition like a naviscript, and to present the information only to a user satisfying the condition. [0394]
  • FIG. 40 exemplifies the configuration of the system for processing information with a time/point presentation condition. This system comprises: a [0395] processing device 1400 having an information retrieving unit 1401, a target user retrieving unit 1402, a range condition processing unit 1403, a relative condition processing unit 1404, and an information transmitting unit 1405; a conditional information database 1410 for storing information with a presentation condition on which a time/point restriction is imposed; a time measuring unit 1411 for measuring time; an estimating module 1412 for estimating a time/point, which is specified relatively in a condition; various sensors 1413 for transmitting various measured values or data; a point information managing system 1415 for identifying the point of users, etc.; and a terminal 1430.
  • The [0396] information retrieving unit 1401 within the processing device 1400 obtains a time from the time measuring unit 1411, and retrieves the information with the condition corresponding to the time. The target user retrieving unit 1402 retrieves a target user satisfying a condition on which a point restriction is imposed based on the point information obtained from the point information managing system 1415. If a condition includes time/point range (area) specification, the range condition processing unit 1403 controls the presentation of information within the range according to the specification. The relative condition processing unit 1404 retrieves information with a condition including a relatively specified time/point, and processes the condition based on the data from the estimating module 1412 and the sensors 1413. The information transmitting unit 1405 transmits information only to a user terminal 1430 which satisfies the condition according to the condition of the retrieved information.
  • The [0397] terminal 1430 is, for example, a PDA, a PDC, a PHS, a car navigation system, a mobile PC, a wearable computer, a radio, etc., which comprises a presenting unit 1431 for receiving information from the information transmitting unit 1405 within the processing device 1400, and for presenting the information.
  • This system stores information on which a time/point restriction is imposed as a presentation condition in the [0398] conditional information database 1410, retrieves information satisfying the condition based on the time measured by the time measuring unit 1411, selects users satisfying the point restriction of the condition based on the point information from the point information managing system 1415, and presents the information to the terminals 1430 of the selected users.
  • As described above, by timely transmitting and presenting information to an adequate user based on a presentation condition such as a time, a point, etc. which accompanies the information, a user or a system workload can be reduced, whereby also an information provider can efficiently provide information. To be more specific, it becomes possible to efficiently provide information when transmitting news, weather, an advertisement, and leisure information such as a leisure spot, a restaurant, etc., or when performing one-to-one marketing, direct marketing, schedule management, etc. [0399]
  • The [0400] conditional information database 1410 stores information with a presentation condition on which a time/point restriction is imposed. The conditional information can be simply described and specified by text in a predetermined format. Or, the conditional information can be created by using a GUI in a similar manner to that in the above described navigation plan creating and information for guidance managing system. Additionally, an index may be automatically created by extracting from existing information a keyword regarding a time or a point. Retrieval, management, and usage are facilitated by using a predetermined format such as a naviscript.
  • Specification of a time/point restriction includes the following types. [0401]
  • Time specification: Presenting the information to all people at a specified time. [0402]
  • Point specification: Presenting the information to all people staying at a specified place. [0403]
  • Time and point specification: Presenting the information to all people staying at a specified point at a specified time. [0404]
  • For more flexible condition setting/information presentation to a user, an absolute or a relative range of a time/point is allowed to be specified. [0405]
  • The information stored in the [0406] conditional information database 1410 includes, for example, event information (such as a concert, a sports game, fireworks, a department store sale, etc.), restaurant information, sightseeing navigation, route navigation, facility navigation, news, weather forecast, television/radio program schedule, traffic information, horoscope, attention alert, a manual, mail, etc.
  • Provided below are examples of information on which a time/point restriction is imposed and which is stored in the [0407] conditional information database 1410. Here, the information is represented in the form of (presentation condition: information).
  • (1) Conditional information including absolute time specification
  • (at 7 am: Today's weather is fine and occasionally cloudy, probability of rainfall is 30% . . .) [0408]
  • This is an example of the case where weather forecast is presented at 7 am. [0409]
  • (at 4 pm: Starting a sale for 30 minutes from now) [0410]
  • This is an example of the case where bargain information is presented at 4 pm. [0411]
  • (2) Conditional information including time range specification
  • (between 11 and 12: Recommended restaurant information, Specialty recommended by chef of Italian restaurant ΔΔΔΔ is Milanese rice casserole . . .) [0412]
  • This is an example of the case where restaurant information is presented between 11:00 and 12:00. [0413]
  • (until 12/20: Recommended new movie, □□□□□□ starring ◯◯ ◯◯◯ . . .) [0414]
  • This is an example of the case where movie information is presented until December 20th. [0415]
  • (3) Conditional information including relative time specification
  • A time restriction condition may be specified also with a relative time, for example, (3 days after information A:), (1 week after the preceding display:), (3 minutes before the arrival at a point P), etc. [0416]
  • (4) Conditional information including time cycle specification
  • (every Tuesday: nonflammable garbage day) [0417]
  • This is an example of information presented by a municipality every Tuesday. [0418]
  • (every 3 hours: refresh time) [0419]
  • This is an example of information presented to a user driving a car at predetermined time intervals. [0420]
  • Similarly, a time restriction condition can be specified by using a description such as “till”, “by”, “about”, “in”, “after”, “before”, “since”, “as”, “when”, “while”, “now”, “then”, “once”, “during”, “within”, etc. [0421]
  • (5) Conditional information including absolute point restriction specification
  • (latitude 36.2.5, longitude 133.33.36: This is Tokyo Station) [0422]
  • This is an example of information presented by specifying a point with unique determinants such as latitude, longitude, an address, etc. [0423]
  • (6) Conditional information including point range restriction specification
  • (Between the point A and the point B: Jam percentage information between “A” and “B” is . . .) [0424]
  • This is an example of information presented between the point A and the point B. [0425]
  • Similarly, point range specification such as (a point within a radius of 500 m from a ◯◯ facility:), (on a highway Δ:), etc. or range specification such as “within a city”, “within a shop”, “within a building”, “a platform of a station”, etc. can be made. [0426]
  • (7) Conditional information including relative point restriction specification
  • Relative specification is made as a function of a matter that is yet to be determined. The condition is therefore not determined when the specification is described; it is determined when the information is actually delivered. Examples of this type of information include (1 km before the point at which you will stay at 12:00:), (within a radius of 300 m from a point at which Mr. ◯◯ stays:), etc. [0427]
  • Additionally, as a derivative of the range specification, a point restriction condition can be specified for a change in the relationship between a person and a point. Examples include (when approaching Tokyo Station: a timetable of Tokyo Station), (within Chiba City: sightseeing navigation of Chiba Prefecture), (from the Nagoya area: a correspondence table between the Nagoya dialect and standard Japanese), (toward Hokkaido: weather of Hokkaido), (apart from Tokyo: leisure spot information), etc. [0428]
  • Similarly, a point restriction condition may be specified by using a description such as “at”, “around”, “in”, “to”, “from”, “on”, “near”, “under”, “above”, “up”, “down”, “for”, “toward”, “apart from”, “through”, etc. [0429]
  • Note that a condition restricting the number of presentation times or the number of target people can be specified in a time or point condition. For example, specification such as (3 times by May 10:), (with a limitation of 300 persons residing in the district A: (information is transmitted to 300 persons selected at random from among target people)), etc. is made. [0430]
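  • One possible way to hold such conditional information is sketched below with assumed field names; the record format is an illustration only and does not reproduce the naviscript or the actual layout of the conditional information database 1410.
    from dataclasses import dataclass
    from typing import Optional, Tuple
    # Illustrative record; the field names and types are assumptions.
    @dataclass
    class ConditionalInfo:
        body: str                                        # the information to present
        at_time: Optional[str] = None                    # absolute time, e.g. "07:00"
        time_range: Optional[Tuple[str, str]] = None     # e.g. ("11:00", "12:00")
        every: Optional[str] = None                      # cycle, e.g. "Tuesday" or "3h"
        relative_to: Optional[str] = None                # e.g. "3 days after information A"
        at_point: Optional[Tuple[float, float]] = None   # latitude, longitude
        radius_km: Optional[float] = None                # point range restriction
        max_presentations: Optional[int] = None          # e.g. 3 times by May 10
    weather = ConditionalInfo(body="Today's weather is fine and occasionally cloudy ...",
                              at_time="07:00")
    lunch = ConditionalInfo(body="Recommended restaurant information ...",
                            time_range=("11:00", "12:00"))
    sale = ConditionalInfo(body="Starting a sale for 30 minutes from now",
                           at_time="16:00",
                           at_point=(35.0, 135.0),       # illustrative coordinates
                           radius_km=0.5)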
  • Provided next is the explanation about a conditional information process. The [0431] processing device 1400 performs the following process.
  • (1) Process for an absolutely specified condition
  • First of all, the information/[0432] condition retrieving unit 1401 retrieves conditional information corresponding to a time, which is measured at the predetermined time intervals by the time measuring unit 1411, from the conditional information database 1410. The predetermined time interval measured by the time measuring unit 1411 is defined to be the shortest possible time interval allowed in the process performed by the information retrieving unit 1401, or a prescribed time interval. For example, the time measuring unit 1411 notifies a time every 10 minutes, so that all the information with a specified time condition, which corresponds to this time interval (10 minutes), are retrieved. Additionally, if the retrieval process requires a considerable amount of time, conditions may not be retrieved in real time but retrieved beforehand at prescribed time intervals, and then the information with the corresponding condition may be extracted each time the corresponding time is actually reached.
  • The target [0433] user retrieving unit 1402 obtains the current points of users from the point information managing system 1415, and retrieves the users staying at points satisfying the place restriction condition within the information retrieved by the information/condition retrieving unit 1401. As a method for identifying the point of a user, a system having a self-position identifying system such as a GPS, etc. can be made to periodically transmit its own point, or the point information managing system 1415 of a PHS or a cellular phone, etc. can be used.
  • The [0434] information transmitting unit 1405 transmits the retrieved information only to the user terminals 1430 retrieved by the target user retrieving unit 1402. As a method for transmitting information to particular user terminals 1430, a method for specifying the IDs of the receiving terminals 1430 and transmitting information is generally used. However, another method for transmitting information to terminals 1430 within a particular range/area can be used, for example, by making an output adjustment such that radio waves reach only the particular range/area in a broadcasting manner.
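  • A sketch of this process for absolutely specified conditions is shown below; the helper functions stand in for the information retrieving unit 1401, the target user retrieving unit 1402, the point information managing system 1415 and the information transmitting unit 1405, and their names, the record layout, and the planar distance approximation are assumptions.
    import math
    def distance_km(p, q):
        return math.hypot((p[0] - q[0]) * 111.0, (p[1] - q[1]) * 91.0)
    def retrieve_by_time(database, now):
        """Information whose time condition corresponds to the notified time."""
        return [info for info in database if info["at_time"] == now]
    def users_within(user_points, center, radius_km):
        """Users whose current point satisfies the point restriction."""
        return [user for user, p in user_points.items()
                if distance_km(p, center) <= radius_km]
    def process_tick(database, now, user_points, send):
        """One step: retrieve by time, select target users, transmit."""
        for info in retrieve_by_time(database, now):
            for user in users_within(user_points, info["at_point"], info["radius_km"]):
                send(user, info["body"])
    database = [{"at_time": "16:00", "at_point": (35.0, 135.0), "radius_km": 0.5,
                 "body": "Starting a sale for 30 minutes from now"}]
    user_points = {"user1": (35.001, 135.002), "user2": (35.2, 135.3)}
    process_tick(database, "16:00", user_points,
                 lambda user, body: print(user, "<-", body))   # only user1 receives the sale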
  • (2) Process for a range condition when a range is specified
  • If the condition that the [0435] information retrieving unit 1401 retrieves from the conditional information database 1410 includes time/point range specification, the range condition processing unit 1403 performs the process as follows.
  • When the information satisfying the time/point restriction condition is presented to a [0436] user terminal 1430 via the information transmitting unit 1405, the range condition processing unit 1403 sets a presentation flag of this information to ON in order to prevent the information from being transmitted to the same user while the presentation flag is ON. When the user exits the range, the range condition processing unit 1403 sets the presentation flag to OFF. Or, when the user first enters this range, the information is presented only once or a predetermined number of times. When the identical information is presented a plural number of times, it can be also presented periodically at predetermined time intervals while the user stays within the range.
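  • The presentation-flag handling described above can be sketched as follows; the class and method names are assumptions used only to make the behaviour concrete, and the variant which re-presents the information periodically while the user stays in the range is omitted for brevity.
    # Illustrative sketch of the range condition process; names are assumptions.
    class RangeConditionProcessor:
        def __init__(self, present):
            self.present = present          # the presenting unit of the terminal
            self.flag_on = set()            # information whose presentation flag is ON
        def update(self, info_id, body, user_in_range):
            if user_in_range and info_id not in self.flag_on:
                self.present(body)          # present once on entering the range
                self.flag_on.add(info_id)   # presentation flag ON
            elif not user_in_range:
                self.flag_on.discard(info_id)   # presentation flag OFF on exiting the range
    rcp = RangeConditionProcessor(print)
    info = "Jam percentage information between A and B ..."
    rcp.update("jam-A-B", info, True)    # presented on entering the range
    rcp.update("jam-A-B", info, True)    # not presented again while the flag is ON
    rcp.update("jam-A-B", info, False)   # the user exits the range; flag set to OFF
    rcp.update("jam-A-B", info, True)    # presented again on re-entering the range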
  • The capability of the range condition processing unit may be arranged within the [0437] terminal 1430. In this case, the range condition processing unit 1403 adjusts the number of presentation times in consideration of other information so that information is presented a required number of times, which suits the amount of information obtained within a unit time period. For example, if there are already three items of information for the user having the amount of information obtained within a unit time period such as 5 times a day, another item of information is adjusted to be displayed twice. As the amount of information obtained within a unit time period, the technique proposed by the Japanese Patent Application No. 10-270672 “Information Presenting Apparatus for Adjusting and Presenting Information and a Method Thereof” can be used. However, since this does not directly relate to the gist of the present invention, its detailed explanation is omitted here.
  • Additionally, the range [0438] condition processing unit 1403 continues to present the information satisfying the range restriction to the presenting unit 1431 as long as the user stays in the range specified by the information. When a plurality of items of information with the corresponding condition are transmitted, the most recently received information (information A) is overwritten and presented on a presentation screen 1461 as shown in FIG. 41A. Or, all candidates of the information satisfying the condition are displayed as a menu 1462 as shown in FIG. 41B. The information items are switched in turn by pressing a button 1463, so that required information is displayed. If all the information cannot be assigned to a displayed menu, as shown in FIG. 41C, the rightmost menu button in the menu 1462 indicates that further information candidates yet to be displayed exist. Information to be presented can be manually selected by pressing the button 1463 also in this case. Additionally, information items to be displayed are allowed to be selected, for example, by displaying the information in an order of recency of the received information items or in descending order of the priorities of the information items. As for the details of the menu display method, the technique proposed by the Japanese Patent Application No. 10-200237 “Electronic Processing Device Having a Menu Interface” can also be used.
  • Furthermore, if running schedule information of a movie is presented during a running period, the information may be presented a prescribed number of times, for example, five times until the end of the running period, on the condition that the amount of information that a user receives has a margin. In this case, the information may be presented according to a function which presents the information more frequently as the running period approaches the end. [0439]
  • (3) Process for a relatively specified condition
  • If the condition retrieved by the [0440] information retrieving unit 1401 includes a relatively specified time/point restriction, the relative condition processing unit 1404 extracts the information having this condition when the relatively specified restriction becomes uniquely determined, and defines the extracted information to be a presentation target. For example, if the time restriction of the presentation condition within information B is “3 days after information A”, which day “3 days after” indicates cannot be identified beforehand. When the information A is received, the time condition of the information B is determined.
  • Accordingly, the relative [0441] condition processing unit 1404 extracts the information B as a presentation target upon receipt of the information A, and presents the information B 3 days after the receipt of the information A.
  • If the time restriction of the condition of information C is “1 week after a previous display”, the condition of the information C is determined when the information C is displayed at least once. Therefore, the relative [0442] condition processing unit 1404 extracts the information C as a presentation target, and performs the presentation process. Similarly, if the point restriction of the condition within information D is “within a radius of 300 m from the point at which Mr. ◯◯ stays”, the information D is extracted as a presentation target while the point of Mr. ◯◯ can be identified. The process for presenting the information D is performed during that time period.
  • Additionally, if the time restriction of a condition is “3 minutes before the arrival at a point P” or “1 km before the point to be stayed at 12:00”, it is impossible to satisfy the condition and present the information unless the arrival time at the point P, or the point to be stayed at 12:00, is identified beforehand. Therefore, when the information having such a condition is presented, the relative [0443] condition processing unit 1404 applies to the condition the value calculated by the estimating module 1412 for estimating the arrival time at the point P or the point to be stayed at 12:00, and performs the presentation process.
  • Suppose that the presentation condition of information E is “3 minutes before the arrival at the point P”, and the arrival time at the point P is estimated to be 10:00, so that the time 9:57 satisfies the time restriction of the condition. In this case, the relative [0444] condition processing unit 1404 determines that the time restriction of the information E is 9:57. When the time 9:57 is notified from the time measuring unit 1411, the information/condition retrieving unit 1401 retrieves from the conditional information database 1410 the information E as the information having the time condition corresponding to this time, and presents the retrieved information. Similarly, if the estimating module 1412 estimates that the point to be stayed at 12:00 is a point A, the relative condition processing unit 1404 performs the process by applying the point A to the condition, and presents the retrieved information 1 km before the point A, at which point the point condition is recognized to be satisfied.
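  • The resolution of such a relatively specified time condition can be sketched as follows; the minute-based time representation and the stand-in for the estimating module 1412 are assumptions introduced only for illustration.
    # Illustrative sketch: turn "N minutes before the arrival at point P"
    # into an absolute presentation time once the arrival time is estimated.
    def minutes(hhmm_text):
        h, m = hhmm_text.split(":")
        return int(h) * 60 + int(m)
    def hhmm(total_minutes):
        return f"{total_minutes // 60:02d}:{total_minutes % 60:02d}"
    def resolve_relative_condition(offset_minutes_before, estimate_arrival):
        """Determine the absolute time restriction from the estimated arrival time."""
        return estimate_arrival() - offset_minutes_before
    # Information E: "3 minutes before the arrival at the point P".
    estimating_module = lambda: minutes("10:00")         # estimated arrival at 10:00
    present_at = resolve_relative_condition(3, estimating_module)
    print(hhmm(present_at))                              # -> 09:57, when E is presented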
  • Such estimation can be made if an action plan is known. Assume that a user driving a car with a car navigation device predetermines a destination, and drives according to a calculated route. In this case, an arrival time can be estimated according to the speed per hour, the jam percentage of a road, etc., and also the point to be stayed at a particular time can be approximately estimated. Additionally, if a user is moving with route information described by a naviscript, it is possible to estimate where the user stays at a specified time even if the user is walking or traveling by train (for example, the [0445] scheduler 1200 shown in FIG. 27 can be used).
  • The configuration of this system varies depending on where a time/point restriction of time/point-conditional information is recognized to select information and where the information is presented to a user. Examples of the configuration of this system are provided below. [0446]
  • FIG. 42 shows a first example of the system configuration in the case where time-conditional information is selected and processed on a server side. The server side comprises the [0447] information retrieving unit 1401, the range condition processing unit 1403, the relative condition processing unit 1404, the information transmitting unit 1405, the time measuring unit 1411, the estimating module 1412, and the sensors 1413, in addition to the conditional information database 1410. The server side extracts the information having a corresponding time condition at predetermined time intervals, and transmits the extracted information to all user terminals 1430 having an information reception capability. The relative condition process and the range condition process are performed on the server side. The terminals 1430 only present the information to users. Note that the range condition processing unit 1403 for performing the range condition process may also be arranged on the terminal 1430 side.
  • FIG. 43 shows a second example of the system configuration in the case where time-conditional information is processed on a terminal side. In this case, the server side comprises only the [0448] information transmitting unit 1405 in addition to the conditional information database 1410. The terminal 1430 side comprises the information retrieving unit 1401, the range condition processing unit 1403, the relative condition processing unit 1404, the time measuring unit 1411, the presenting unit 1431, and an information buffer 1432. Time-conditional information transmitted from the server side is received on the terminal 1430 side. The received information is selected based on the time condition, and the selected information is presented to a user.
  • Since the range condition process or the relative condition process is performed on the terminal [0449] 1430 side in this case, a user can set the processing method for each of the conditions. Setting examples include "if "till" is used, notification is made 3 days or 1 day before a specified date", "displayed only once when entering a range" as the range specification, "displayed only in a determinate case (the estimating module is not used)" as the relative specification, etc. Note that the amount of information to be presented to a user, and its timing, may be changed by adjusting the amount of information and by assigning priorities between the time specification and the adjustment of the amount of information.
  • FIG. 44 shows the flow of the process performed when time-conditional information is processed. If the time condition assigned to the information includes relative specification, the relative condition process is performed by the relative condition processing unit [0450] 1404 (step S401). A time is obtained by the time measuring unit 1411 (step S402), and the information having the condition corresponding to the obtained time is retrieved by the information retrieving unit 1401 (step S403). If the condition of the retrieved information includes time range specification, the range condition process is performed for this information by the range condition processing unit 1403 (step S404). Then, the retrieved information is presented to the user terminal 1430 (step S405).
  • Steps S[0451] 401 through S404 are performed on either of the server and the terminal sides, while step S405 is performed on the terminal side.
  • FIG. 45 shows a third example of the system configuration in the case where point-conditional information is processed on a server side. The server side comprises the target [0452] user retrieving unit 1402, the range condition processing unit 1403, the relative condition processing unit 1404, the information transmitting unit 1405, the estimating module 1412, sensors 1413, the point information managing system 1415 in addition to the conditional information database 1410. The terminal 1430 side only comprises the presenting unit 1431. The server side transmits information directly to a user at a point or within a range corresponding to a condition, and further transmits corresponding information if it receives new information or if a user range changes. Here, point-conditional information corresponding to a user point may be retrieved based on the user point, and the retrieved information may be provided. Or, a user corresponding to each point condition within the information may be retrieved, and the information may be presented to the user.
  • FIG. 46 shows the flow of the process performed when point-conditional information is processed. If a point condition assigned to information includes relative specification, the relative condition process is performed by the relative condition processing unit [0453] 1404 (step S411). The point information of a user is obtained from the point information managing system 1415 (step S412), and a user corresponding to the point restriction is retrieved by the target user retrieving unit 1402 (step S413). If the condition of the retrieved information includes point range specification, the range condition process is performed for this information by the range condition processing unit 1403 (step S414). The information is then presented to the retrieved user (step S415).
  • Steps S[0454] 411 through S414 are operations performed on the server side, while step S415 is performed on the terminal side.
  • FIG. 47 shows a fourth example of the system configuration in the case where point-conditional information is processed on a terminal side. The server side comprises only the [0455] information transmitting unit 1405 in addition to the conditional information database 1410. The terminal 1430 side comprises the information retrieving unit 1401, the range condition processing unit 1403, the relative condition processing unit 1404, the presenting unit 1431, the information buffer 1432 for storing a process result of a relative condition, and a self-point identifying unit 1433 in place of the point information managing system 1415.
  • Even if the server only transmits point-conditional information, the information having the point condition suitable for a user point can be presented when the terminal [0456] 1430 side comprises the self-point identifying unit 1433 (such as a GPS device). Additionally, the user side can specify a point restriction range. By way of example, “receiving only information within a radius of 1 km from a transmitting source of point-conditional information (the center of a specified range within the information)”, “presenting information only when a corresponding point exists on a route during a move, regardless of a specified point range”, etc. are specified. The process of the terminal 1430 in this configuration example is similar to that explained by referring to FIG. 46.
  • Other examples of the system configuration in the case where time/point conditional information is processed fall into the following 4 types. [0457]
  • (1) System configuration where a server side identifies a time/point condition, while a terminal side does not perform a process. [0458]
  • (2) System configuration where a server side identifies a time condition, while a terminal side identifies a point condition. [0459]
  • (3) System configuration where a server side identifies a point condition, while a terminal side identifies a time condition. [0460]
  • (4) System configuration where a server side does not perform a process, while a terminal side identifies a time/point condition. [0461]
  • Even if a server only transmits time/point-conditional information, the information having a time/point condition, which is suitable for a user, can be presented by the presenting [0462] unit 1431 when the terminal 1430 side comprises the self-point identifying unit 1433 (such as a GPS device).
  • For the identification of a time condition and that of a point condition, either may be performed first. However, the system load can be reduced if the more efficient identification is selected based on the amount of information distributed under each type of condition. Accordingly, it is normally considered more advantageous to perform the identification of a time condition first. Otherwise, information may be transmitted by identifying a point condition on the server side, and information satisfying a time condition may be further identified from the transmitted information on the terminal side, so that the identified information is presented to a user. [0463]
  • FIG. 48 shows the flow of the process performed when a server side processes time/point-conditional information. If the condition assigned to the information includes relative specification, the relative condition process is performed by the relative condition processing unit [0464] 1404 (step S421). Next, a time is obtained by the time measuring unit 1411 (step S422), and the information having the condition corresponding to the obtained time is retrieved by the information retrieving unit 1401 (step S423). The point information of users is then obtained from the point information managing system 1415 (step S424), and a user satisfying the point restriction is retrieved by the target user retrieving unit 1402 (step S425). If the condition of the retrieved information includes time/point range specification, the range condition process is performed by the range condition processing unit 1403 (step S426). The information is transmitted to the terminal 1430 of the selected user, and the selected information is presented by the presenting unit 1431 (step S427).
  • FIG. 49 shows the flow of the process performed when a terminal side selects time/point-conditional information. Steps S[0465] 431 through S435 are operations performed on the terminal 1430 side. If the time/point condition of the information transmitted from a server side includes relative specification, the relative condition process is performed by the relative condition processing unit 1404 (step S431). Then, the time and the self-point are respectively obtained by the time measuring unit 1411 and the self-point identifying unit 1433 (step S432), and the information having the condition corresponding to the obtained time/point is selected by the information retrieving unit 1401 (step S433). Furthermore, if the condition of the selected information includes time/point range specification, the range condition process is performed for this information by the range condition processing unit 1403 (step S434). The selected information is then presented by the presenting unit 1431 (step S435).
  • FIG. 50 shows a sixth example of the system configuration where time/point-conditional information is processed by a terminal having a self-schedule managing capability. In this case, the terminal [0466] 1430 side comprises the information retrieving unit 1401, the range condition processing unit 1403, the relative condition processing unit 1404, the time measuring unit 1411, and the self-point identifying unit 1433, as units for identifying a time/point condition. The terminal 1430 side further comprises the inputting unit 1434 for inputting, as time/point-conditional information, also the information of a schedule of the local user or of the group to which the user belongs. With this configuration, both general time/point-conditional information and information such as an individual schedule can be presented according to a time/point condition, and the user can receive only the information satisfying the condition.
  • FIG. 51 shows the flow of the process performed when time/point-conditional information is processed by the terminal having the self-schedule managing capability. First, the time/point-conditional information transmitted from a server side is received (step S[0467] 441). Schedule information is input from the inputting unit 1434 (step S442). If the condition assigned to the information includes relative specification, the relative condition process is performed by the relative condition processing unit 1404 (step S443), and a time and a self-point are respectively obtained by the time measuring unit 1411 and the self-point identifying unit 1433 (step S444). The information having the condition corresponding to the obtained time/point is retrieved by the information retrieving unit 1401 (step S445). Furthermore, if the condition of the retrieved information includes time/point range specification, the range condition process is performed by the range condition processing unit 1403 for this information (step S446). The retrieved information is then presented by the presenting unit 1431 (step S447).
  • An example where conditional information is described by a naviscript is provided below. [0468]
    <par>
    <inst>
    <point>
    <latitude> N35.11.11.111 </latitude>
    <longitude> E135.22.22.222 </longitude>
    <address> 1-1, ΔΔ, ◯◯ City </address>
    </point>
    <info area="1.0km">
    <text> This is built in the year □□,
    and famous for XX . . . </text>
    <voice src="aaa.wav"/>
    <image src="bbb.jpg"/>
    </info>
    </inst>
    </par>
  • The contents of this naviscript mean that the text data “This is built in the year □□, and famous for XX . . .”, the voice data within the file “aaa.wav”, and the image data within the file “bbb.jpg” are presented to the user staying within a range of the radius of 1.0 km from the point at the latitude of N35.11.11.111 and the longitude of E135.22.22.222. Information in the above described explanation means information contents which are significant to the user. However, information is not limited to the above described type of information. Information which is not significant to a user can be processed in a similar manner as a signal passing through a machine. [0469]
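  • Although only a point condition appears in the example above, the time/point-conditional information discussed with reference to FIGS. 48 through 51 could be sketched in a similar way, assuming that an <inst> may carry both a <time> tag and a <point> tag; the values below are illustrative only.
    <par>
    <inst>
    <time> 17:00 </time>
    <point>
    <latitude> N35.11.11.111 </latitude>
    <longitude> E135.22.22.222 </longitude>
    </point>
    <info area="1.0km">
    <text> The evening event is held near here . . . </text>
    </info>
    </inst>
    </par>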
  • Specification of the Naviscript Language
  • Explained below are the details of the specification, such as tags, attributes, and contents, which are used by the naviscript language in the above described preferred embodiments. The relationship between a tag, an attribute, and a content (data not including a tag set is called a content) is: [0470]
  • <tag attribute="attribute value"> content [0471]
  • </tag>[0472]
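  • As concrete instances drawn from the conditional-information example above, the lines below show this relationship: in the first, “latitude” is the tag and “N35.11.11.111” is the content; in the second, “info” is the tag, “area” is the attribute, and “1.0km” is the attribute value.
    <latitude> N35.11.11.111 </latitude>
    <info area="1.0km"> . . . </info>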
  • (a) the highest-order description [0473]
  • (1) tag:<naviscript>[0474]
  • indicates that this description is a naviscript. [0475]
  • attribute: [0476]
  • version—indicates the version of the naviscript. [0477]
  • content: the following tag set can be included herein. [0478]
  • <title>, <version>, <author>, <affiliation>, <date>, <copyright>, <comment>, <navi>, <inst>, <point>, <object>[0479]
  • (b) a description below <naviscript>[0480]
  • (2) tag: <title>[0481]
  • attribute: none [0482]
  • content: the title of navigation described by this naviscript is written. [0483]
  • example: Rainbow Town Tour [0484]
  • (3) tag: <version>[0485]
  • attribute: none [0486]
  • content: the version of the navigation described by this naviscript is written. [0487]
  • example: example-0405 [0488]
  • (4) tag: <author>[0489]
  • attribute: none [0490]
  • content: the author of the navigation described by this naviscript is written. [0491]
  • example: Ai Ueo, Kaki Kukeko [0492]
  • (5) tag: <affiliation>[0493]
  • attribute: none [0494]
  • content: the company to which the navigation described by this naviscript belongs is written. [0495]
  • example: Fuji Kanko [0496]
  • (6) tag: <date>[0497]
  • attribute: none [0498]
  • content: the date on which the navigation described by this naviscript is written. [0499]
  • example: 98/09/10 [0500]
  • (7) tag: <copyright>[0501]
  • attribute: none [0502]
  • content: the copyright of the navigation described by this naviscript is written. [0503]
  • example: All Rights Reserved, Copyright (©) FujiLab Ltd. 1998 [0504]
  • (8) tag: <comment>[0505]
  • attribute: none [0506]
  • content: the comment on the navigation described by this naviscript is written. [0507]
  • (9) tag: <navi>[0508]
  • attribute: none [0509]
  • content: the following tag set can be included. [0510]
  • <title>, <author>, <date>, <country>, <area>, <genre>, <duration>, <distance>, <cost>, <course>, <comment>, <seq> or <par>[0511]
  • (c) a description below <navi>[0512]
  • (10) tag: <title>[0513]
  • attribute: none [0514]
  • content: the title of contents to be navigated is written. [0515]
  • example: Rainbow Town [0516]
  • (11) tag: <author>[0517]
  • attribute: none [0518]
  • content: the author of the contents to be navigated is written. [0519]
  • example: Fuji Kanko [0520]
  • (12) tag: <date>[0521]
  • attribute: none [0522]
  • content: the date of the contents to be navigated is written. [0523]
  • example: 98/09/10 [0524]
  • (13) tag: <country>[0525]
  • attribute: none [0526]
  • content: the name of the country to which points, routes, and facilities to be navigated belong is written. [0527]
  • example: Japan [0528]
  • (14) tag: <area>[0529]
  • attribute: none [0530]
  • content: an area to which a point, a route, and a facility to be navigated belong is written. [0531]
  • example: Tokyo, Odaiba [0532]
  • (15) tag: <genre>[0533]
  • attribute: none [0534]
  • content: a category to which the contents to be navigated belongs is written. [0535]
  • example: driving, viewing [0536]
  • (16) tag: <duration>[0537]
  • attribute: none [0538]
  • content: a required time of a course to be navigated is written. [0539]
  • example: 3 hours 40 min [0540]
  • (17) tag: <distance>[0541]
  • attribute: none [0542]
  • content: a travel distance of the course to be navigated is written. [0543]
  • example: 95.0 km [0544]
  • (18) tag: <cost>[0545]
  • attribute: none [0546]
  • content: a required cost of the course to be navigated is written. [0547]
  • example: 1940 yen [0548]
  • (19) tag: <course>[0549]
  • attribute: none [0550]
  • content: the course to be navigated is written. [0551]
  • example: Kaihinmakuhari—Tokyo—Rainbow Bridge—Fujisan TV—Tokyo [0552]
  • (20) tag: <comment>[0553]
  • attribute: none [0554]
  • content: a comment on the contents to be navigated is written. [0555]
  • (21) tag: <seq>[0556]
  • “seq” means “sequential”. <seq> indicates that included items are sequentially executed. [0557]
  • attribute: [0558]
  • time-optimal—items relating to <point> included in the tag set of <seq> are rearranged so as to minimize a required time, and the rearranged items are sequentially executed. [0559]
  • distance-optimal—items relating to <point> included in the tag set of <seq> are rearranged so as to minimize a required distance, and the rearranged items are sequentially executed. [0560]
  • cost-optimal—attributes relating to <route> included in the tag set of <seq> are determined so as to minimize a required cost, and the result is sequentially executed. [0561]
  • content: the following tag sets or an arbitrary combination of an arbitrary number of them can be included. [0562]
  • <inst>, <seq>, <par>[0563]
  • (22) tag: <par>[0564]
  • “par” means “parallel”. <par> indicates that included items are executed in parallel. [0565]
  • attribute: none [0566]
  • content: the following tag sets or an arbitrary combination of an arbitrary number of them can be included. [0567]
  • <inst>, <seq>, <par>[0568]
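  • A minimal sketch of how <seq> and <par> can be nested is shown below; the instructions themselves are elided, and any combination of <inst>, <seq>, and <par> permitted by the specification could appear in their place.
    <seq>
    <inst> . . . </inst>
    <par>
    <inst> . . . </inst>
    <inst> . . . </inst>
    </par>
    </seq>
  • In this sketch the first instruction is executed first, after which the two instructions grouped by <par> are executed in parallel.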
  • (d) a description below <seq> or <par>, or below <naviscript>[0569]
  • (23) tag: <inst>[0570]
  • “inst” means “instruction”. [0571]
  • attribute: [0572]
  • id—an ID for making an internal or external reference is assigned. [0573]
  • example: id=“inst-info-introduction”[0574]
  • ref—an internal or external <inst> is referenced by describing an ID assigned to the <inst>. [0575]
  • example: ref=“inst-info-introduction”[0576]
  • if—a condition of whether or not to execute an instruction is written. If the condition is satisfied, the instruction is executed. If not, the instruction is not executed. [0577]
  • example: if=“(ref(inst-point-Daiba IC#time) &ge 11:30) && (ref(inst-point-Daiba IC#time) &le 13:30)”[0578]
  • This condition means “if the contents of the tag set of <time> within the tag set to which the ID of Daiba IC is assigned indicate 11:30 or later, and 13:30 or earlier”. The symbols and their meanings used within the attribute “if” are as follows. [0579]
  • relational operator [0580]
  • &eq or == (==: equal) [0581]
  • The left side is equal to the right side. [0582]
  • &ne or !=(!=: not equal) [0583]
  • The left side is not equal to the right side. [0584]
  • &le [0585]
  • The left side is equal to or less than the right side. (<=: less or equal) [0586]
  • &ge [0587]
  • The left side is equal to or greater than the right side. (>=: greater or equal) [0588]
  • &lt [0589]
  • The left side is less than the right side. (<: less than) [0590]
  • &gt [0591]
  • The left side is greater than the right side. (>: greater than) [0592]
  • Logical Operator [0593]
  • &not or ![0594]
  • negation (NOT) [0595]
  • &and [0596]
  • and (AND) [0597]
  • &or [0598]
  • or (OR) [0599]
  • content: the following tag sets or an arbitrary combination of an arbitrary number of them can be included. [0600]
  • <time> or <point> or <location> or <object>, and <route>, <info>[0601]
  • (e) a description below <inst> or <naviscript>[0602]
  • (24) tag: <time>[0603]
  • “time” indicates a time at which navigation is performed. [0604]
  • attribute: [0605]
  • id—an ID for making an internal or external reference is assigned. [0606]
  • ref—an internal or external <time> is referenced by describing the ID assigned to the <time>. [0607]
  • content: a time at which navigation is performed. [0608]
  • A time can be specified both absolutely and relatively as follows. [0609]
  • <time> 12:00</time>[0610]
  • Absolute time specification “at 12:00”. [0611]
  • <time>+5 sec </time>[0612]
  • Relative time specification “5 seconds after a preceding instruction”. [0613]
  • <time>−10 min</time>[0614]
  • Relative time specification “10 minutes before a succeeding instruction”. [0615]
  • (25) tag: <point>[0616]
  • “point” indicates a point at which navigation is performed. “point” absolutely stipulates a point. [0617]
  • attribute: [0618]
  • id—an ID for making an internal or an external reference is assigned. [0619]
  • example: id=point-Daiba IC [0620]
  • ref—an internal or external <point> is referenced by describing the ID assigned to the <point>. [0621]
  • example: ref=“point-Daiba IC”[0622]
  • ref=“http://www.naviscript.com/japan/tokyo/odaiba.nav#point-breakwater”[0623]
  • content: the following tag sets can be included. [0624]
  • <name>, <category>, <latitude>, <longitude>, <altitude>, <cost>, <comment>[0625]
  • These tag sets can be recognized as the elements for stipulating a point at which navigation is performed. [0626]
  • (f) a description below <point>[0627]
  • (26) tag: <name>[0628]
  • attribute: none [0629]
  • content: the name of the point is written. [0630]
  • (27) tag: <category>[0631]
  • attribute: none [0632]
  • content: the category of the point is written. [0633]
  • example: station [0634]
  • (28) tag: <latitude>[0635]
  • attribute: none [0636]
  • content: the latitude of the point is written. [0637]
  • example: 36.2.5 [0638]
  • (29) tag: <longitude>[0639]
  • attribute: none [0640]
  • content: the longitude of the point is written. [0641]
  • example: 133.33.36 [0642]
  • (30) tag: <altitude>[0643]
  • attribute: none [0644]
  • content: the altitude of the point is written. [0645]
  • example: 100 m [0646]
  • (31) tag: <cost>[0647]
  • attribute: none [0648]
  • content: cost required at the point, such as an admission fee, is written. [0649]
  • example: 540 yen [0650]
  • (32) tag: <comment>[0651]
  • attribute: none [0652]
  • content: a comment on the point is written. [0653]
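  • A sketch of a <point> description assembled from the tags defined in this section is shown below; the names and values are illustrative only.
    <point id="point-Daiba IC">
    <name> Daiba IC </name>
    <category> interchange </category>
    <latitude> 36.2.5 </latitude>
    <longitude> 133.33.36 </longitude>
    <altitude> 100 m </altitude>
    <comment> Get off the expressway here. </comment>
    </point>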
  • (g) a description below <inst> or <naviscript>[0654]
  • [continued from (e)][0655]
  • (33) tag: <location>[0656]
  • “location” indicates a position at which navigation is performed. “location” relatively stipulates a position. [0657]
  • attribute: [0658]
  • id—an ID for making an internal or external reference is assigned. [0659]
  • ref—an internal or external <location> is referenced by describing the ID assigned to the <location>. [0660]
  • content: a position at which navigation is performed is written. A position can be relatively specified as follows. [0661]
  • <location>+1.0 km </location>[0662]
  • Relative position specification “1 km after a preceding point”. [0663]
  • <location>−1.0 km </location>[0664]
  • Relative position specification “1 km before a succeeding point”. [0665]
  • (34) tag: <object>[0666]
  • “object” indicates an object to be navigated such as a building. [0667]
  • attribute: [0668]
  • id—an ID for making an internal or external reference is assigned. [0669]
  • example: id=“object-Rainbow Bridge”[0670]
  • ref—an internal or external <object> is referenced by describing the ID assigned to the <object>[0671]
  • example: ref=“object-cafe”[0672]
  • content: the following tag sets can be included. <name>, <category>, <address>, <zip-code>, <country>, <phone>, <fax>, <url>, <e-mail>, <latitude>, <longitude>, <altitude>, <open>, <close>, <reservation>, <comment>, <text>, <voice>, <audio>, <image>, <video>[0673]
  • These tag sets can be recognized as the elements for stipulating an object such as a facility to be navigated. [0674]
  • (h) a description below <object>[0675]
  • (35) tag: <name>[0676]
  • attribute: none [0677]
  • content: the name of the object is written. [0678]
  • example: Restaurant Fujitsu [0679]
  • (36) tag: <category>[0680]
  • attribute: none [0681]
  • content: the category of the object is written. [0682]
  • example: restaurant, Italian, [0683]
  • (37) tag: <address>[0684]
  • attribute: none [0685]
  • content: The address of the object is written. [0686]
  • example: 9-9-9, Daiba, Minato Ward, Tokyo [0687]
  • (38) tag: <zip-code>[0688]
  • attribute: none [0689]
  • content: the zip code of the object is written. [0690]
  • example: 012-3456 [0691]
  • (39) tag: <country>[0692]
  • attribute: none [0693]
  • content: the name of the country to which the object belongs is written. [0694]
  • example: Japan [0695]
  • (40) tag: <phone>[0696]
  • attribute: none [0697]
  • content: the telephone number of the object is written. [0698]
  • example: 987-654-3210 [0699]
  • (41) tag: <fax>[0700]
  • attribute: none [0701]
  • content: the fax number of the object is written. [0702]
  • example: 999-999-9999 [0703]
  • (42) tag: <url>[0704]
  • attribute: none [0705]
  • content: the web page address (URL: Uniform Resource Locator) relating to the object is written. [0706]
  • example: http://www.fujisan-tv.com/ [0707]
  • (43) tag: <e-mail>[0708]
  • attribute: none [0709]
  • content: the e-mail address relating to the object is written. [0710]
  • example: www@fujisan-tv.com [0711]
  • (44) tag: <latitude>[0712]
  • attribute: none [0713]
  • content: the latitude of the object is written. [0714]
  • example: 36.3.5 [0715]
  • (45) tag: <longitude>[0716]
  • attribute: none [0717]
  • content: the longitude of the object is written. [0718]
  • (46) tag: <altitude>[0719]
  • attribute: none [0720]
  • content: the altitude of the object is written. [0721]
  • example: 999 m [0722]
  • (47) tag: <open>[0723]
  • attribute: none [0724]
  • content: open days of the week and open time of the object are written. [0725]
  • example: Monday through Friday, 10:00-17:00 [0726]
  • (48) tag: <close>[0727]
  • attribute: none [0728]
  • content: closed days of the week and closed time of the object are written. [0729]
  • example: Saturday, Sunday, holidays [0730]
  • (49) tag: <reservation>[0731]
  • attribute: none [0732]
  • content: whether or not a reservation for the object is required is written. [0733]
  • example: reservation required [0734]
  • (50) tag: <comment>[0735]
  • attribute: none [0736]
  • content: a comment on the object is written. [0737]
  • (51) tag: <text>[0738]
  • attribute: [0739]
  • duration—a time period during which “text” is displayed. [0740]
  • content: “text” displayed as one form of object navigation is written in text. [0741]
  • example: Specialty is . . . made by an Italian chef. [0742]
  • (52) tag: <voice>[0743]
  • attribute: [0744]
  • duration—a time period during which “voice” is output. [0745]
  • times—the number of times that “voice” is output. [0746]
  • content: “voice” output as one form of object navigation is written in text. [0747]
  • example: Specialty is . . . made by an Italian chef. [0748]
  • (53) tag: <audio>[0749]
  • attribute: [0750]
  • src—an “audio” file output as one form of object navigation is specified. [0751]
  • duration—a time period during which “audio” is output. [0752]
  • content: none [0753]
  • (54) tag: <image>[0754]
  • attribute: [0755]
  • src—an “image” file displayed as one form of object navigation is specified. [0756]
  • duration—a time period during which an “image” is displayed. [0757]
  • content: none [0758]
  • (55) tag: <video>[0759]
  • attribute: [0760]
  • src—a “video” file played as one form of object navigation is specified. [0761]
  • duration—a time period during which a “video” is played. [0762]
  • content: none [0763]
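  • A sketch of an <object> description assembled from the tags defined in this section is shown below; the facility and its details are illustrative only.
    <object id="object-restaurant">
    <name> Restaurant Fujitsu </name>
    <category> restaurant, Italian </category>
    <address> 9-9-9, Daiba, Minato Ward, Tokyo </address>
    <phone> 987-654-3210 </phone>
    <open> Monday through Friday, 10:00-17:00 </open>
    <reservation> reservation required </reservation>
    <text> Specialty is . . . made by an Italian chef. </text>
    </object>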
  • (i) a description below <inst> or <naviscript> [continued from (g)]. [0764]
  • (56) tag: <route>[0765]
  • “route” indicates a route to be navigated.
  • attribute: [0766]
  • id—an ID for making an internal or external reference is assigned. [0767]
  • ref—an internal or external <route> is referenced by describing the ID assigned to the <route>. [0768]
  • content: The following tag sets can be included. [0769]
  • <means>, <name>, <category>, <cost>, <comment>[0770]
  • These tag sets can be recognized as the elements for stipulating a route to be navigated. Or, the following content can be written. [0771]
  • the same—this indicates that the information of a route from a current point to the next point is the same as that of the route from the preceding point to the current point. [0772]
  • (j) a description below <route>[0773]
  • (57) tag: <means>[0774]
  • attribute: none [0775]
  • content: a method for moving along the route is written. [0776]
  • example: walk, bicycle, car, bus, train, ship, plane, . . . [0777]
  • (58) tag: <name>[0778]
  • attribute: none [0779]
  • content: the name of the route is written. [0780]
  • example: Route 1 [0781]
  • (59) tag: <category>[0782]
  • attribute: none [0783]
  • content: the category of the route is written. [0784]
  • example: a normal road, a toll road, a highway, an esplanade, . . . [0785]
  • (60) tag: <cost>[0786]
  • attribute: none [0787]
  • content: the cost required on the route is written. [0788]
  • example: 540 yen [0789]
  • (61) tag: <comment>[0790]
  • attribute: none [0791]
  • content: a comment on the route is written. [0792]
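  • A sketch of a <route> description assembled from the tags defined in this section is shown below; the values are illustrative only.
    <route>
    <means> car </means>
    <name> Route 1 </name>
    <category> a toll road </category>
    <cost> 540 yen </cost>
    <comment> Congestion is expected on weekends. </comment>
    </route>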
  • (k) a description below <inst> or <naviscript>[continued from (i)][0793]
  • (62) tag: <info>[0794]
  • “info” indicates information to be navigated.
  • attribute: [0795]
  • ref—an internal or external <info> is referenced by describing the ID assigned to the <info>. [0796]
  • example: ref=object-Rainbow Bridge#info [0797]
  • content: the following tag sets can be included. [0798]
  • <seq> or <par>[0799]
  • (l) a description below <info>[0800]
  • (63) tag: <seq>[0801]
  • “seq” indicates “sequential”. <seq> indicates that included items are executed sequentially. [0802]
  • attribute: none [0803]
  • content: the following tag sets or an arbitrary combination of an arbitrary number of them can be included. [0804]
  • <text>, <voice>, <audio>, <image>, <video>[0805]
  • (64) tag: <par>[0806]
  • “par” indicates “parallel”. <par> indicates that included items are executed in parallel. <par> is defined to be a default setting in a portion below <info>, and thus <par> can be omitted. [0807]
  • attribute: none [0808]
  • content: the following tag sets or an arbitrary combination of an arbitrary number of them can be included. [0809]
  • <text>, <voice>, <audio>, <image>, <video>[0810]
  • (m) a description below <seq> or <par> below <info>, or just below <info>[0811]
  • (65) tag: <text>[0812]
  • attribute: [0813]
  • ref—an internal or external <text> is referenced by describing the ID assigned to the <text>. [0814]
  • example: ref=“object-restaurant#text”[0815]
  • duration—a time period during which “text” is displayed. [0816]
  • content: “text” displayed as one form of navigation is written in text. [0817]
  • example: Welcome to Rainbow Town Tour![0818]
  • (66) tag: <voice>[0819]
  • attribute: [0820]
  • ref—an internal or external <voice> is referenced by describing the ID assigned to the <voice>. [0821]
  • example: ref=“object-restaurant#voice”[0822]
  • duration—a time period during which “voice” is output. [0823]
  • content: “voice” output as one form of navigation is written in text. [0824]
  • example: Hope you enjoyed this tour![0825]
  • (67) tag: <audio>[0826]
  • attribute: [0827]
  • ref—an internal or external <audio> is referenced by describing the ID assigned to the <audio>. [0828]
  • example: ref=“object-restaurant#audio”[0829]
  • src—an “audio” file output as one form of navigation is specified. [0830]
  • duration—a time period during which “audio” is output. [0831]
  • content: none [0832]
  • (68) tag: <image>[0833]
  • attribute: [0834]
  • ref—an internal or external <image> is referenced by describing the ID assigned to the <image>. [0835]
  • example: ref=“object-restaurant#image”[0836]
  • src—an “image” file displayed as one form of navigation is specified. [0837]
  • duration—a time period during which “image” is displayed. [0838]
  • content: none [0839]
  • (69) tag: <video>[0840]
  • attribute: [0841]
  • ref—an internal or external <video> is referenced by describing the ID assigned to the <video>. [0842]
  • example: ref=“object-restaurant#video”[0843]
  • src—a “video” file reproduced as one form of navigation is specified. [0844]
  • duration—a time period during which “video” is reproduced. [0845]
  • content: none [0846]
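  • A sketch of an <info> description that presents text and then an image sequentially, using the <seq> tag defined above, is shown below; the file name “ccc.jpg” and the text are illustrative only.
    <info>
    <seq>
    <text> Welcome to Rainbow Town Tour! </text>
    <image src="ccc.jpg"/>
    </seq>
    </info>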
  • The above described specification of a naviscript language is merely one example, and can be easily expanded and changed when being designed. [0847]
  • Characteristics of the Preferred Embodiments
  • The characteristics of the preferred embodiments are listed below. [0848]
  • 1. Naviscript Description Methods
  • (01) A naviscript is written by using a sequence of instructions which include as a constituent element a presentation time, or both of a presentation time and information for guidance to be output at that time. [0849]
  • (02) A naviscript is written by using a sequence of instructions which include as a constituent element a point to be reached, or both of a point to be reached and information for guidance to be output at that point. [0850]
  • (03) A naviscript is written by using a sequence of instructions which include as a constituent element a presentation time or both of a presentation time and information for guidance to be output at that time, and/or a point to be reached or both of a point to be reached and information for guidance to be output at that place. [0851]
  • (04) A naviscript can describe that a plurality of instructions are processed sequentially or in parallel, and that navigation is made in an optimum order of required durations, distances, or costs of a plurality of instructions, or an order specified by a compound combination of the instructions in the above described methods (01), (02), and (03). [0852]
  • (05) A naviscript can specify respective times by using a time range obtained by an arbitrary combination of an absolute time like “10:00”, a relative time like “10 minutes after”, “a time at or before . . . ”, “a time at or after . . . ”, “a time before . . . ”, and “a time after . . . ” in the above described methods (01), (03), and (04). [0853]
  • (06) A naviscript can specify respective points by using a point range obtained by an arbitrary combination of an absolute point (for example, coordinates such as latitude, longitude, and altitude, or a proper attribute of an object which can indirectly identify a point, such as a name, an address, a telephone number, etc.), a relative point (such as “10 km beyond . . . ”), a point range (such as “within a radius of 10 km”), and a point range such as an attribute of an abstract concept which can indirectly identify a point (such as a name, an address, a zip code, etc.), or “inside . . . ”, “outside . . . ”, “within . . . ”, and “beyond . . . ” in the above described methods (02), (03), and (04). [0854]
  • (07) A naviscript can specify a route or a track, which is a point transition with time, by using an arithmetic function, a separately defined function, or separately specified data, or an arbitrary combination of the functions and data in the above described methods (02), (03), and (04). [0855]
  • (08) A naviscript can specify a condition of whether or not to execute each instruction by describing whether or not a navigation provider/providing apparatus, a navigation user/using apparatus, information about navigation contents, information about a move method, and information about peripheral situation, or their combination is equal to a certain value or belongs to a certain range (set) in the above described methods (01), (02), (03), and (04). [0856]
  • (09) A naviscript can describe also a variety of external information such as a facility, an object, an event (such as a concert, an exhibition, etc.), a timetable, etc., which relate to a presentation time, a presentation time and navigation information to be output at that time, a point to be reached, and a point to be reached and information for guidance to be output at that point, by specifying their locations with network addresses, etc. in the above described methods (01), (02), (03), and (04). [0857]
  • (10) A naviscript can specify as a navigation outputting means characters, a map, voice, music, an image, a video, light, smell, force, and a movement, or their arbitrary combination in the above described methods (01), (02), (03), and (04). [0858]
  • (11) A naviscript can describe the items relating to the summary of navigation in the above described methods (01), (02), (03), and (04). [0859]
  • (12) A naviscript is written so that identifiers like tags are assigned to respective items such as a time, a point, navigation information, summary of navigation, etc. in the above described methods (01), (02), (03), (04), and (11). [0860]
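  • A minimal end-to-end sketch assembled only from the tags defined in the above specification, illustrating the description methods (01) through (03), is shown below; all names and values are illustrative.
    <naviscript version="1.0">
    <title> Rainbow Town Tour </title>
    <navi>
    <seq>
    <inst>
    <point>
    <name> Daiba IC </name>
    </point>
    <info>
    <text> Turn left at the next intersection. </text>
    </info>
    </inst>
    <inst>
    <time> +10 min </time>
    <info>
    <voice> The next spot is coming up. </voice>
    </info>
    </inst>
    </seq>
    </navi>
    </naviscript>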
  • 2. Naviscript Generation Method
  • (01) Respective types of move data such as times, places, etc. of a user or a move method, and/or respective types of media data such as voice, music, an image, a video, etc. are sampled, so that a part or the whole of the above described naviscript is semi-automatically generated in a discrete or a continuous manner. [0861]
  • 3. Naviscript Storing Methods
  • (01) A portion or the whole of the naviscript is stored by being assigned a unique number or name which can identify data. [0862]
  • (02) A portion or the whole of the naviscript is classified and stored according to an item relating to the summary of the navigation described therein. [0863]
  • (03) With respect to a portion or the whole of the naviscript, only one principal script is stored and a link to the principal script is stored for each classification. [0864]
  • (04) The language specification and available tags of a portion or the whole of the naviscript are registered to a particular location on a network, and a limitation is imposed so that only the registered tags are utilized by all users, in order to facilitate the distribution of the naviscript. [0865]
  • 4. Naviscript Retrieving Methods
  • (01) A desired naviscript is retrieved with a given keyword using contents, which are obtained by excluding tags included in a portion or the whole of the naviscript, as a target. [0866]
  • (02) A desired naviscript is retrieved with a given keyword using only contents, which relate to a particular tag (set) included in a portion or the whole of the naviscript, as a target. [0867]
  • (03) A desired naviscript is retrieved by using as keys one or a plurality of tags and the contents relating to the tags in a portion or the whole of the naviscript. [0868]
  • (04) Tags of items relating to the summary of the navigation in a portion or the whole of the naviscript are used as the tags referred to in the above described method (03). [0869]
  • 5. Naviscript Creating System (Editor)
  • (01) The naviscript creating system creates a portion or the whole of the naviscript by dividing one or a plurality of naviscripts into respective portions in units of instruction, and/or by merging the portions. [0870]
  • (02) The naviscript creating system assists in attaching tags in a portion or the whole of the naviscript based on the specification of the naviscript language by automatically completing the name of a tag when characters at the beginning of the tag or its abbreviation are input, or by automatically attaching a tag with a selection from a menu. [0871]
  • (03) The naviscript creating system arranges a portion or the whole of the naviscript in a hierarchical structure based on the specification of the naviscript language, and displays the naviscript. [0872]
  • (04) The naviscript creating system has a parsing capability and a debugging capability, which are intended to check a portion or the whole of the naviscript, to indicate a grammatically erroneous portion to a user, and to automatically modify the erroneous portion based on the specification of the naviscript language. [0873]
  • (05) The naviscript creating system has an inputting unit for inputting from a map information system to a portion or the whole of the naviscript the information about a place including latitude, longitude, altitude, object attributes such as a name, an address, a telephone number, etc., and/or navigation information accompanying the place via a buffer or a file, and an outputting unit for outputting the portion or the whole of the naviscript via a buffer or a file. [0874]
  • (06) The naviscript creating system has a loading/saving unit for loading/saving a portion or the whole of the naviscript to/in a local file system and/or a network file system. [0875]
  • (07) The naviscript creating system has a storing unit for storing a portion or the whole of the naviscript by using the above described storing methods for the naviscript. [0876]
  • 6. Naviscript Conversion Method
  • (01) Instructions are extracted from a portion or the whole of the naviscript, and the extracted instructions are converted into structured navigation data. [0877]
  • (02) Instructions are extracted from a portion or the whole of the naviscript, particular flags are attached to the respective contents of a time, place, and navigation information, and these contents are converted into structured navigation data so that they can be distinguished from the respective contents of a time, place, and navigation information added/modified by the system. [0878]
  • (03) An item relating to a time included in an instruction in a portion or the whole of the naviscript is converted into that of another naviscript for describing time information (for example, for a scheduler, etc.) [0879]
  • (04) A description relating to a place (a point or a route) included in an instruction in a portion or the whole of the naviscript is converted into that of another naviscript for describing the place or a map. [0880]
  • (05) A description relating to an output medium in a portion or the whole of the naviscript is converted into that of another naviscript for an output of the output medium. [0881]
  • (06) Respective types of parameters in a portion or the whole of the naviscript, which are required by a naviscript resultant from conversion, are specified by a description within the naviscript, a default value, another specification file, a user menu selection, or their combination. [0882]
  • (07) The naviscript is converted into a form of an article in a travel advertisement or an informative magazine. [0883]
  • (08) The naviscript is converted into a form of a program or a commercial of television or radio. [0884]
  • 7. Naviscript Converting System (Translator)
  • (01) The naviscript converting system comprises a converting unit for converting a portion or the whole of the naviscript with the above described conversion methods. [0885]
  • 8. Navigation Instruction Executing Method
  • (01) An instruction extracted from a naviscript is executed with the process algorithm shown in FIGS. 7 and 8, or with the process algorithm a portion of which is omitted. [0886]
  • 9. Navigation Instruction Executing System (Processor)
  • (01) The navigation instruction executing system executes an instruction extracted from a naviscript with the process algorithm shown in FIGS. 7 and 8 or with the process algorithm a portion of which is omitted. [0887]
  • 10. Navigation Output Methods Based on a Naviscript
  • (01) Navigation is performed for a current point, a departure point, en-route spots, a destination, a route, etc., one after another, or according to each instruction in a portion or the whole of the naviscript. Or, navigation is performed for a certain time, distance, or point, or with an input operation or according to an external event. [0888]
  • (02) With respect to a portion or the whole of the naviscript, navigation such as straight-going, right-turn, or left-turn is performed at an intersection or a particular point. [0889]
  • (03) With respect to a portion or the whole of the naviscript, the entire contents such as a departure point, en-route spots, a destination, a route, etc. are presented. [0890]
  • (04) With respect to a portion or the whole of the naviscript, navigation is performed by switching or combining the above described methods (01), (02), and (03). [0891]
  • (05) With respect to a portion or the whole of the naviscript, navigation is performed by using characters, a map, voice, music, an image, a video, light, smell, force, movement, etc. [0892]
  • (06) A combination or a switching selection in the above described methods (04) and (05) is specified by a description in a portion or the whole of the naviscript, a default value, another specification file, or a user menu selection, or their combination. [0893]
  • 11. Navigation Outputting System Based on a Naviscript (Browser)
  • (01) The navigation outputting system outputs navigation with the above described navigation output methods based on the naviscript. [0894]
  • 12. Naviscript Providing System and Using System
  • (01) The naviscript providing system (a server or a center) provides to a terminal via a network or an electronic medium a naviscript which describes an information sequence of various points and a route such as a recommended date spot, sightseeing course, etc., and navigation information accompanying the information sequence (such as facility information, a right-turn or left-turn directive, etc.), and makes the terminal perform navigation according to the naviscript. [0895]
  • (02) The naviscript using system (a client or a terminal) obtains a naviscript being an information sequence describing various spots or a route such as a recommended date spot, sight-seeing course, etc. and navigation information accompanying the information sequence (such as facility information, a right-turn or left-turn directive, etc.) via a network or an electronic medium, and performs navigation according to the obtained naviscript. [0896]
  • (03) The naviscript providing system uses a portion or the whole of the naviscript as the naviscript used in the above described (01) or (02). [0897]
  • (04) The naviscript using system uses a portion or the whole of the naviscript as the naviscript used in the above described method (01) or (02). [0898]
  • (05) The naviscript providing system and using system are configured by using the above described naviscript conversion methods, the above described naviscript converting system (translator), the above described navigation instruction execution method, the above described navigation instruction executing system (processor), the navigation output method based on the naviscript, and the above described navigation outputting system (browser). [0899]
  • (06) The naviscript providing system and using system have a navigation mode and/or a simulation mode as navigation modes. In the navigation mode, the system obtains the information about an actual current time/point from a state acquiring unit, and performs actual navigation according to the obtained information. In the simulation mode, the system obtains the information about a virtual current time/point, and performs virtual navigation according to the obtained information. [0900]
  • (07) The naviscript providing system and using system have a state generating unit for initializing and stopping a virtual current time or moving a virtual current time forward or backward, and for allowing the forward or the backward moving speed to be changed during simulation in the simulation mode. [0901]
  • (08) The naviscript providing system and using system change the contents of navigation, an output method, and/or a display method according to the type of a using device. [0902]
  • 13. Various Service Types Using Naviscripts
  • (01) The naviscript providing system and using system can be used for route navigation, sightseeing navigation and guidance, a delivery plan, a travel plan, traffic control, scheduling, an amusement, a municipal service, etc. [0903]
  • (02) The systems can be used for driving management such that a driving management center returns a reserved and modified naviscript by receiving a naviscript describing an itinerary/route that a user desires from an information device such as a car navigation system, and by making a comparison/coordination between the naviscript and the data stored in a driving management database. [0904]
  • (03) The systems automatically determine whether or not a user can arrive by an arrival time, and propose an action to be taken by the user, so that a time adjustment during a move can be made when navigation information using a naviscript is presented. [0905]
  • (04) The systems attach navigation information to areas on a map, automatically capture the navigation information about a route when a user selects the route on the map, and can easily create a navigation plan with a naviscript obtained by combining navigation information. [0906]
  • (05) An information providing system can be configured so that a range or a relatively specified time/point condition can also be processed for the information with the time/point presentation condition, which is described by a naviscript, and so that suitable information is transmitted to a user or users corresponding to the time/point condition. [0907]
  • Effects and Applications of the Preferred Embodiments
  • As described above, the following effects can be obtained with the preferred embodiments according to the present invention. [0908]
  • (a) A navigation service can be provided/used to/by various users offline or online with various types of devices/media at the same time or different times at the same point or different points. [0909]
  • (b) A naviscript can be executed, converted, created, edited, divided, merged, changed, modified, copied, deleted, stored, and retrieved. It can be also formed into a database for reuse. Additionally, the naviscript can be carried or transferred by a suitable electronic medium or a network. Furthermore, it can be sold, purchased, issued, received, given, taken, thrown away, picked up, value-added (such as a mileage service), etc. [0910]
  • (c) Specifically, the naviscript can be created and provided by anybody such as a naviscript center, a contents provider, each facility, an individual, a group, etc. [0911]
  • (d) Additionally, the naviscript can be used by various devices/media such as a PC, a car navigation system, PDA, PDC, PHS, an IC card, a prepaid card, a magnetic disk, an optical disk, a bar code, paper, etc. [0912]
  • (e) Especially, a PC, etc. can be used as the naviscript providing device, while a PC, a car navigation system, PDA, PDC, PHS, and the like can be used as the naviscript using device. [0913]
  • (f) A naviscript created by a PC, a car navigation system, PDA, PDC, PHS, etc. can be written to an IC card or a prepaid card. Additionally, the naviscript written to the IC card or the prepaid card is read into the car navigation system, the PDA, the PDC, the PHS, etc., so that the instructions for a navigation service can be executed. [0914]
  • (g) The navigation received by a certain PC, car navigation system, PDA, PDC, PHS, etc. can be continuously received even if the device receiving the naviscript is changed to a different PC, car navigation system, PDA, PDC, PHS, etc. For example, if a user who carries a PDA and receives navigation while walking rides in a car, the same navigation can be received from a car navigation system with the shared use of the same naviscript. Similarly, for example, a private tour can be made by riding multiple cabs in succession. If a naviscript is stored in a PDA, etc. and carried in this case, a copy of the naviscript is transmitted to a corresponding car navigation system at the time of riding in a cab. This is exactly the same at the time of a transfer to another cab. Besides, a narrative service based on navigation information can be received. [0915]
  • (h) The naviscript being received by a certain PC, car navigation system, PDA, PDC, PHS, etc. can be received also by another PC, car navigation system, PDA, PDC, PHS, etc. Therefore, for example, a plurality of private cars can make the same tour. Namely, one naviscript can be shared by many people. [0916]
  • (i) The naviscript can be carried or dealt in by an IC card or a prepaid card. By way of example, with an IC card storing a naviscript and electronic money, a ticket can be purchased when inserting the IC card into an automatic ticket vending machine, and a seat and also a hotel can be reserved at a ticket center. Since this information is added to the naviscript, not only is navigation performed from the station to the seat, but check-in at the hotel can also be easily made. Furthermore, if the naviscript is read into a terminal in a room, it becomes possible for the terminal to look after the user. For example, “Is it OK to wake you up at ◯ o'clock tomorrow?”, etc. is uttered according to the schedule. [0917]
  • (j) Prepaid cards describing various naviscripts such as a historic spots tour, a route along which a famous person passed, a tour of impressive movie scenes, etc. can be sold. Also shop advertisements or a movie schedule can be included therein. Furthermore, restaurant information, etc. can be included, which facilitates making a reservation at a restaurant. Note that the cards may only include the principal part of a naviscript, and quoted voice or image data may be stored at a source in a network. [0918]
  • (k) For example, a tissue paper including a naviscript prepaid card is distributed in front of a station, or a name card is formed as a naviscript prepaid card, so that the advertised shop or company can be easily reached. [0919]
  • (l) A naviscript can be written/read as a bar code. [0920]
  • (m) A course introduced by television can be downloaded from the Internet as a naviscript, or a demo tape of a course carried in a magazine or a guide book can be viewed with an input from an attached bar code or CD-ROM. [0921]
  • (n) Since a naviscript can be described as text data, it can be written/read to/from paper, plastic, etc. For example, a paper sheet on which a naviscript describing the procedure for reaching a treasure spot is written can be stored in a particular place like a bank, similar to cash. [0922]
  • (o) A naviscript can be applied in various ways. As navigation within a building, a natural world, or a virtual world, for example, indoor navigation using an elevator or an escalator, navigation in a skiing area or a golf course, navigation for going down a stream or for scuba diving, an experience of a simulated sightseeing flight or space trip, navigation in a virtual shopping mall, etc. can be performed. [0923]
  • (p) Also a naviscript allows navigation in the past or in the future. The navigation can be applied to, for example, navigation for Tokaido, an explanation about an invasion, a battle, or a war, an explanation about scenery viewed from the Silk Road or a car/train window in the world, a flashback of a movie such as the Titanic, creation of a logbook or a travel album, etc. [0924]
  • (q) With a naviscript, the flows of animals, goods, substances, and information can be made visible. For example, the movement of a migratory bird, an image of a driving simulation of a train or a bus, the movement of a weather satellite, an explanation about the flow of barter or trade, a mail transmission path, etc. can be displayed. [0925]
  • (r) It is possible to make a naviscript easy to read and write by realizing it with a standard mark-up language. [0926]
  • (s) Since a naviscript is fundamentally text data, a storage medium with a small capacity is sufficient. Also the execution of its instructions can be realized with ease. [0927]
  • As described above, according to the present invention, navigation with various types of data can be experienced by various types of devices, systems, and media in various places. Additionally, navigation can be virtually received by setting a virtual time or place. Note that the present invention is not limited to navigation for a route such as a road, etc. The present invention can be also applied to navigation in a virtual space/time world, visualization of a travel course of an animal, transportation facilities, a weather satellite, etc., a display of barter, a display of a transmission path of mail, etc. [0928]

Claims (45)

What is claimed is:
1. A navigation information presenting apparatus for presenting navigation information to a user according to a state, comprising:
means for inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
means for performing one of state acquisition of acquiring a state including at least one of a current time and a current point, and state generation of generating a state including at least one of a virtual current time and a virtual current point;
means for processing instructions described in the input navigation script according to at least one of a current time and a current point, which is obtained by one of the state acquisition and the state generation; and
means for outputting navigation information to be output as the instructions are processed, and for presenting the navigation information to the user.
2. The navigation information presenting apparatus according to
claim 1
, wherein the navigation script is described in a markup language which identifies the time information, the point information, the information for guidance, and other instruction constituent elements by using tags.
3. The navigation information presenting apparatus according to
claim 1
, wherein:
the navigation script can describe a sequential process instruction for processing a plurality of instructions sequentially and a parallel process instruction for processing a plurality of instructions in parallel; and
said means for processing instructions processes the plurality of instructions sequentially according to the sequential process instruction, and processes the plurality of instructions in parallel according to the parallel process instruction.
4. The navigation information presenting apparatus according to
claim 1
, wherein
said means for inputting the navigation script inputs a navigation script specified by the user with one of methods such as a communication with an external device providing a navigation script via a network, a read from a computer-readable electronic medium, and an input from an input device operated by the user.
5. The navigation information presenting apparatus according to
claim 1
, further comprising
means for parsing the input navigation script, and for converting the input navigation script into structured navigation data, wherein
said means for processing instructions processes instructions represented in the form of the structured navigation data.
6. The navigation information presenting apparatus according to
claim 1
, wherein said means for outputting navigation information presents to the user, with respect to at least a portion of the navigation script, information for guidance about at least one of a current point, a departure point, an en-route spot, a destination, and a route by using at least one of texts, maps, voice, images, and videos with one of a method for presenting the navigation information one after another, a method for presenting the navigation information for each instruction, and a method for presenting the navigation information for at least one of specified time, distance, point, input operation, and external event.
7. A navigation information presenting apparatus for presenting navigation information to a user according to a state, comprising:
means for inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
means for setting at least one of operation modes such as a navigation mode and a simulation mode according to at least one of a user input operation and a system setting;
means for acquiring a state including at least one of a current time and a current point in the navigation mode;
means for generating a state including at least one of a virtual current time and a virtual current point in the simulation mode;
means for processing instructions described in the input navigation script according to at least one of the current time and the current point, which is obtained by state acquisition, in the navigation mode, and for processing the instructions according to at least one of the virtual current time and the virtual current point, which is obtained by state generation, in the simulation mode; and
means for outputting navigation information to be output as the instructions are processed, and for presenting the navigation information to the user.
8. A navigation information presentation processing method for presenting navigation information to a user according to a state, comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof, with one of methods such as a communication via a network, a read from a computer-readable electronic medium, and a user input operation;
performing one of state acquisition of acquiring a state including at least one of a current time and a current point, and state generation of generating a state including at least one of a virtual current time and a virtual current point;
processing instructions described in the input navigation script according to at least one of a current time and a current point, which is obtained by one of the state acquisition and the state generation; and
outputting navigation information to be output as the instructions are processed, and presenting the navigation information to the user.
9. A navigation information presenting method for presenting navigation information to a user according to a state, comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
setting at least one of operation modes such as a navigation mode and a simulation mode according to at least one of a user input operation and a system setting;
acquiring a state including at least one of a current time and a current point in the navigation mode;
generating a state including at least one of a virtual current time and a virtual current point in the simulation mode;
processing instructions described in the input navigation script according to at least one of the current time and the current point, which is obtained by state acquisition, in the navigation mode, and processing the instructions according to at least one of the virtual current time and the virtual current point, which is obtained by state generation, in the simulation mode; and
outputting navigation information to be output as the instructions are processed, and presenting the navigation information to the user.
10. A computer-readable storage medium on which is recorded a program for implementing with a computer an apparatus which presents navigation information to a user according to a state, and for causing the computer to execute a process comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and place information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify the type of each of the information and the contents thereof, with one of methods such as a communication via a network, a read from a computer-readable electronic medium, and a user input operation;
performing one of state acquisition of acquiring a state including at least one of a current time and a current point, and state generation of generating a state including at least one of a virtual current time and a virtual current point;
processing instructions described in the input navigation script according to at least one of a current time and a current point, which is obtained by one of the state acquisition and the state generation; and
outputting navigation information to be output as the instructions are processed, and presenting the navigation information to the user.
11. A computer-readable storage medium on which is recorded a program for implementing with a computer an apparatus which presents navigation information to a user according to a state, and for causing the computer to execute a process comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
setting at least one of operation modes such as a navigation mode and a simulation mode according to at least one of a user input operation and a system setting;
acquiring a state including at least one of a current time and a current point in the navigation mode;
generating a state including at least one of a virtual current time and a virtual current point in the simulation mode;
processing instructions described in the input navigation script according to at least one of the current time and the current point, which is obtained by state acquisition, in the navigation mode, and processing the instructions according to at least one of the virtual current time and the virtual current point, which is obtained by state generation, in the simulation mode; and
outputting navigation information to be output as the instructions are processed, and presenting the navigation information to the user.
12. A navigation script storage medium which can be read by an apparatus for presenting navigation information to a user according to a state, storing
a navigation script being a script composed of an electronic code sequence described in a markup language based on a predetermined specification, the script being an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof, and the navigation script being used to present navigation information to the user according to instructions within the script read by the apparatus.
13. A device for semi-automatically generating a navigation script used by an apparatus for presenting navigation information to a user according to a state, comprising:
means for acquiring a state including at least one of a current time and a current point with one of a method for acquiring a state at predetermined time intervals, a method for acquiring a state for each point, a method for acquiring a state at predetermined distance intervals, and a method for acquiring a state according to a user instruction;
means for generating a navigation script composed of an instruction sequence based on a predetermined specification, with which at least one of time information and place information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof, based on a history of the acquired state including at least one of each acquired time and each acquired point; and
means for storing the generated navigation script in an electronic medium.
14. A device for managing driving data, comprising:
inputting means for inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of each of the information and the contents thereof;
a driving management database for managing data where at least one of time information and point information, and at least one of a reservation state and a corresponding point state are described;
coordinating means for making a comparison and coordination between the navigation script input by said inputting means and the data stored in said driving management database, and for performing a process of modifying the navigation script and a process of updating the data stored in the driving management database according to a result of the comparison and coordination depending on need; and
outputting means for outputting a resultant navigation script.
15. A driving management method using a computer, comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and navigation information to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
referencing a driving management database for managing data where at least one of time information and point information, and at least one of a reservation state and a corresponding point state are described, for making a comparison and coordination between the input navigation script and the data stored in said driving management database, and for performing a process of modifying the navigation script and a process of updating the data stored in the driving management database according to a result of the comparison and coordination depending on need; and
outputting a resultant navigation script.
16. A computer-readable storage medium on which is recorded a program for implementing with a computer a device for managing driving data, and for causing the computer to execute a process comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, with which at least one of time information and point information, and navigation information to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
referencing a driving management database for managing data where at least one of time information and point information, and at least one of a reservation state and a corresponding place state are described, for making a comparison and coordination between the input navigation script and the data stored in said driving management database, and for performing a process of modifying the navigation script and a process for updating the data stored in the driving management database according to a result of the comparison and coordination depending on need; and
outputting a resultant navigation script.
17. A device for proposing an action to be taken by a user depending on whether or not the user can arrive by an arrival time, comprising:
inputting means for inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
a scheduler for scheduling arrival times at respective places;
a rule base for storing rules describing actions to be taken depending on whether or not there is sufficient time to an arrival time; and
a monitoring and executing device for checking arrival times at subsequent points from a current time at a current point, and for executing a corresponding rule if the rule is stored within said rule base, depending on whether or not there is sufficient time to an arrival time.
18. A method for proposing an action to be taken by a user depending on whether or not the user can arrive by an arrival time, comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
scheduling arrival times at respective points; and
checking arrival times at subsequent points from a current time at a current point, for referencing rules in a rule base describing actions to be taken depending on whether or not there is time to spare by an arrival time, and for executing a corresponding rule if the rule is stored within said rule base, depending on whether or not there is sufficient time to an arrival time.
19. A computer-readable storage medium on which is recorded a program for implementing with a computer a device which proposes an action to be taken depending on whether or not a user can arrive by an arrival time, and for causing the computer to execute a process comprising the steps of:
inputting a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and navigation information to be output according to at least one of a presentation time and a presentation place can be described using a set of combinations of a name which can identify a type of the information and the contents thereof;
scheduling arrival times at respective points; and
checking arrival times at subsequent points from a current time at a current point, for referencing rules in a rule base describing actions to be taken depending on whether or not there is sufficient time to an arrival time, and for executing a corresponding rule if the rule is stored within said rule base, depending on whether or not there is sufficient time to an arrival time.
20. A navigation plan creating apparatus for creating a navigation plan obtained by combining information for guidance, comprising:
means for associating one of a range and a point on map information with information for guidance;
means for setting a specified route on the map information; and
means for creating a navigation plan by extracting the information for guidance associated with the set route.
21. The navigation plan creating apparatus according to
claim 20
, wherein
the navigation plan is represented by a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof.
22. The navigation plan creating apparatus according to
claim 20
, further comprising
an information for guidance management database for managing the information for guidance associated with the map information for each attribute including at least one of season, time, and a user type, wherein
said means for creating the navigation plan creates a navigation plan suiting a specified attribute by referencing said information for guidance management database.
23. The navigation plan creating apparatus according to
claim 20
, wherein:
said means for associating the one of the range and the point on the map information with the information for guidance can specify one of a valid time period and a valid time limit when associating with the information for guidance; and
said means for creating the navigation plan performs one of a process of creating a navigation plan by extracting one of only information for guidance within the valid time period and only information for guidance up to the valid time limit when extracting the information for guidance associated with the set route, and a process of creating a navigation plan having a condition relating to one of the valid time period and the valid time limit.
24. The navigation plan creating apparatus according to
claim 20
, wherein:
said means for associating the one of the range and the point on the map information with the information for guidance can specify a state including at least one of a direction in which a range is entered and a speed at which the range is entered when specifying the range; and
said means for creating the navigation plan performs one of a process of creating a navigation plan by extracting only information satisfying a condition including at least one of a direction in which a range on the set route is entered and a speed at which the range is entered when extracting the information for guidance associated with the route, and a process for creating a navigation plan having a condition relating to the state including at least one of the direction in which the range on the set route is entered and the speed at which the range is entered.
25. A navigation plan creating method for creating a navigation plan obtained by combining information for guidance with the use of a computer, comprising the steps of:
associating one of a range and a point on map information with information for guidance;
setting a specified route on the map information; and
creating a navigation plan by extracting the information for guidance associated with the set route.
26. A computer-readable storage medium on which is recorded a program for implementing with a computer a device which creates a navigation plan obtained by combining information for guidance, and for causing the computer to execute a process comprising the steps of:
associating one of a range and a point on map information with information for guidance;
setting a specified route on the map information; and
creating a navigation plan by extracting the information for guidance associated with the set route.
27. A navigation information providing apparatus for providing information to a user, comprising:
means for managing information with a presentation condition relating to a time;
means for checking the information with the presentation condition at predetermined time intervals, and for retrieving information corresponding to a current time; and
means for providing the retrieved information to the user.
28. The navigation information providing apparatus according to
claim 27
, further comprising
range condition processing means for determining how to present information when the current time is included in a time range if the presentation condition is a condition indicating the time range, wherein
said means for providing the information provides the information according to said range condition processing means when the presentation condition is satisfied.
29. The navigation information providing apparatus according to
claim 27
, further comprising
means for checking the information with the presentation condition by using an estimating module estimating a future state if the presentation condition is a relative condition.
30. The navigation information providing apparatus according to
claim 27
, wherein
the information with the presentation condition is represented by a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof.
31. A navigation information providing apparatus for providing information to a user, comprising:
means for managing information with a presentation condition relating to a point;
means for obtaining information about a point of a user;
means for checking the information with the presentation condition according to the obtained point of the user, and for retrieving information corresponding to the point of the user; and
means for providing the retrieved information to the user.
32. The navigation information providing apparatus according to
claim 31
, further comprising
range condition processing means for determining how to present information when the point of the user is included in a point range if the presentation condition is a condition indicating the point range, wherein
said means for providing the information provides the information according to said range condition processing means when the presentation condition is satisfied.
33. The navigation information providing apparatus according to
claim 31
, further comprising
means for checking the information with the presentation condition by using an estimating module estimating a future state if the presentation condition is a relative condition.
34. The navigation information providing apparatus according to
claim 31
, wherein
the information with the presentation condition is represented by a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof.
35. A navigation information providing apparatus for providing information to a user, comprising:
means for managing information with a presentation condition relating to a point;
means for obtaining information about a point of a user;
means for checking the information with the presentation condition, and for retrieving a user corresponding to the presentation condition; and
means for providing the information with the presentation condition to the retrieved user.
36. The navigation information providing apparatus according to
claim 35
, further comprising
range condition processing means for determining how to present information when the point of the user is included in a point range if the presentation condition is a condition indicating the point range, wherein
said means for providing the information provides the information according to said range condition processing means when the presentation condition is satisfied.
37. The navigation information providing apparatus according to
claim 35
, further comprising
means for checking the information with the presentation condition by using an estimating module estimating a future state if the presentation condition is a relative condition.
38. The navigation information providing apparatus according to
claim 35
, wherein
the information with the presentation condition is represented by a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof.
39. A navigation information providing apparatus for providing information to a user, comprising:
means for managing information with a presentation condition relating to a time and a point;
means for checking the information with the presentation condition at predetermined time intervals, and for retrieving information corresponding to a current time;
means for obtaining information about a point of a user;
means for performing one of a process of checking the information with the presentation condition according to the obtained point of the user and retrieving information corresponding to the point of the user, and a process of checking the information with the presentation condition and retrieving a user corresponding to the presentation condition; and
means for providing the retrieved information to the retrieved user.
40. The navigation information providing apparatus according to
claim 39
, further comprising
range condition processing means for determining how to present information when one of the current time and the point of the user is included in a certain range if the presentation condition is a condition indicating the certain range, wherein
said means for providing the information provides the information according to said range condition processing means when the presentation condition is satisfied.
41. The navigation information providing apparatus according to
claim 39
, further comprising
means for checking the information with the presentation condition by using an estimating module estimating a future state if the presentation condition is a relative condition.
42. The navigation information providing apparatus according to
claim 39
, wherein
the information with the presentation condition is represented by a navigation script composed of an instruction sequence based on a predetermined specification, in which at least one of time information and point information, and information for guidance to be output according to at least one of a presentation time and a presentation point can be described using a set of combinations of a name which can identify a type of the information and the contents thereof.
43. A navigation information providing method for providing information to a user, comprising the steps of:
managing information with a presentation condition relating to at least one of a time and a point;
retrieving information corresponding to at least one of a current time and a point of a user receiving the provided information with one of a method of checking the information with the presentation condition at predetermined time intervals, and a method of checking the information with the presentation condition according to the point of the user; and
providing the retrieved information to a user corresponding to the presentation condition.
44. A computer-readable storage medium on which is recorded a program for implementing with a computer a device which provides information to a user, and for causing the computer to execute a process comprising the steps of:
managing information with a presentation condition relating to at least one of a time and a point;
retrieving information corresponding to at least one of a current time and a point of a user receiving the provided information with one of a method of checking the information with the presentation condition at predetermined time intervals, and a method of checking the information with the presentation condition according to the point of the user; and
providing the retrieved information to a user corresponding to the presentation condition.
45. A navigation information presenting apparatus for presenting navigation information to a user according to a state, comprising:
means for dynamically selecting navigation information to be presented according to a change of at least one of time information and point information; and
means for outputting the selected navigation information according to at least one of a time at which the selected navigation information is to be presented and a point at which the selected navigation information is to be presented.
US09/392,221 1998-04-21 1999-09-09 Apparatus and method for presenting navigation information based on instructions described in a script Expired - Lifetime US6336072B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US09/991,921 US6748316B2 (en) 1998-04-21 2001-11-26 Apparatus and method for presenting navigation information based on instructions described in a script
US09/994,003 US6697731B2 (en) 1998-11-20 2001-11-27 Apparatus and method for presenting navigation information based on instructions described in a script

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP10-330960 1998-11-20
JP33096098 1998-11-20
JP11-113191 1999-04-21
JP11319199A JP3548459B2 (en) 1998-11-20 1999-04-21 Guide information presenting apparatus, guide information presenting processing method, recording medium recording guide information presenting program, guide script generating apparatus, guide information providing apparatus, guide information providing method, and guide information providing program recording medium

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US09/991,921 Division US6748316B2 (en) 1998-04-21 2001-11-26 Apparatus and method for presenting navigation information based on instructions described in a script
US09/994,003 Division US6697731B2 (en) 1998-11-20 2001-11-27 Apparatus and method for presenting navigation information based on instructions described in a script

Publications (2)

Publication Number Publication Date
US20010056443A1 true US20010056443A1 (en) 2001-12-27
US6336072B1 US6336072B1 (en) 2002-01-01

Family

ID=26452201

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/392,221 Expired - Lifetime US6336072B1 (en) 1998-04-21 1999-09-09 Apparatus and method for presenting navigation information based on instructions described in a script
US09/991,921 Expired - Lifetime US6748316B2 (en) 1998-04-21 2001-11-26 Apparatus and method for presenting navigation information based on instructions described in a script
US09/994,003 Expired - Lifetime US6697731B2 (en) 1998-11-20 2001-11-27 Apparatus and method for presenting navigation information based on instructions described in a script

Family Applications After (2)

Application Number Title Priority Date Filing Date
US09/991,921 Expired - Lifetime US6748316B2 (en) 1998-04-21 2001-11-26 Apparatus and method for presenting navigation information based on instructions described in a script
US09/994,003 Expired - Lifetime US6697731B2 (en) 1998-11-20 2001-11-27 Apparatus and method for presenting navigation information based on instructions described in a script

Country Status (4)

Country Link
US (3) US6336072B1 (en)
EP (3) EP1003017B1 (en)
JP (1) JP3548459B2 (en)
DE (1) DE69936500T2 (en)

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010009427A1 (en) * 2000-01-25 2001-07-26 Kazuma Kaneko Navigation apparatus and recording medium providing communication between applications
US20020099734A1 (en) * 2000-11-29 2002-07-25 Philips Electronics North America Corp. Scalable parser for extensible mark-up language
US20030056175A1 (en) * 2001-08-24 2003-03-20 Masahiro Fujihara Information processing method and apparatus
US6845322B1 (en) * 2003-07-15 2005-01-18 Televigation, Inc. Method and system for distributed navigation
US20050039136A1 (en) * 2003-08-11 2005-02-17 Konstantin Othmer Systems and methods for displaying content in a ticker
US20050039135A1 (en) * 2003-08-11 2005-02-17 Konstantin Othmer Systems and methods for navigating content in an interactive ticker
US20050043883A1 (en) * 2001-03-02 2005-02-24 Kabushiki Kaisha Toshiba Image forming system and image forming apparatus
US20050075786A1 (en) * 2001-12-17 2005-04-07 Susumu Sakatani Data providing service system
US20050210391A1 (en) * 2003-08-11 2005-09-22 Core Mobility, Inc. Systems and methods for navigating content in an interactive ticker
US20060074696A1 (en) * 2002-10-15 2006-04-06 Sharp Kabushiki Kaisha Information processing device, information processing method, information processing program and medium
US20060230108A1 (en) * 2005-04-07 2006-10-12 Olympus Corporation Information display system
US20060236258A1 (en) * 2003-08-11 2006-10-19 Core Mobility, Inc. Scheduling of rendering of location-based content
US20070093957A1 (en) * 2003-10-23 2007-04-26 Shin Kikuchi Image data transmitting/receiving system, server, mobile phone terminal,program and recording medium
US20070291034A1 (en) * 2006-06-20 2007-12-20 Dones Nelson C System for presenting a navigable virtual subway system, and method for operating and using the same
US20080018659A1 (en) * 2006-07-21 2008-01-24 The Boeing Company Overlaying information onto a view for electronic display
US7343564B2 (en) * 2003-08-11 2008-03-11 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
US7370283B2 (en) 2003-08-11 2008-05-06 Core Mobility, Inc. Systems and methods for populating a ticker using multiple data transmission modes
US20080201071A1 (en) * 2001-03-29 2008-08-21 Gilad Odinak Vehicle navigation system and method
US20080215193A1 (en) * 2007-03-02 2008-09-04 The Boeing Company Electronic flight bag having filter system and method
US20090006994A1 (en) * 2007-06-28 2009-01-01 Scott Forstall Integrated calendar and map applications in a mobile device
US20090082958A1 (en) * 2004-11-15 2009-03-26 Pioneer Corporation Travel Route Display System
US20090276318A1 (en) * 2006-04-20 2009-11-05 Mitac International Corp. Navigation Provision System and Framework for Providing Content to an End User
US20090321512A1 (en) * 2006-08-25 2009-12-31 Huebler Arved Navigation device
US20100094463A1 (en) * 2007-06-15 2010-04-15 Fujitsu Limited Robot
US20100168999A1 (en) * 2008-12-26 2010-07-01 Fujitsu Limited Computer readable medium for storing information display program, information display apparatus and information display method
US20100268413A1 (en) * 2007-11-29 2010-10-21 Airbus Operations Gmbh Electronic technical logbook
US7925540B1 (en) * 2004-10-15 2011-04-12 Rearden Commerce, Inc. Method and system for an automated trip planner
US20110130958A1 (en) * 2009-11-30 2011-06-02 Apple Inc. Dynamic alerts for calendar events
US20110179390A1 (en) * 2010-01-18 2011-07-21 Robert Paul Morris Methods, systems, and computer program products for traversing nodes in path on a display device
US20120259539A1 (en) * 2009-12-28 2012-10-11 Clarion Co., Ltd. Navigation Device and Guiding Method Thereof
US20120323482A1 (en) * 2011-06-16 2012-12-20 Mitac Research (Shanghai) Ltd. Program-storing computer-readable storage medium, computer program product, navigation device and control method thereof
US8346237B2 (en) 2008-09-18 2013-01-01 Apple Inc. Communications device having a commute time function and methods of use thereof
US20140100779A1 (en) * 2004-11-16 2014-04-10 Microsoft Corporation Methods for automated and semiautomated composition of visual sequences, flows, and flyovers based on content and context
US8712685B2 (en) 2012-01-26 2014-04-29 Fuji Xerox Co., Ltd. Information processing apparatus, non-transitory computer-readable recording medium, and information processing method
US20140206397A1 (en) * 2007-07-12 2014-07-24 Yahoo! Inc. Mobile notification system
US20140330516A1 (en) * 2013-03-28 2014-11-06 LinkedIn Corporation Navigating with a camera device
US9268474B2 (en) 2011-01-13 2016-02-23 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium to control display of a map
JP2017010285A (en) * 2015-06-22 2017-01-12 オカムラ印刷株式会社 Information processing system, information processing program, information processing apparatus, and control program thereof
JP2017010284A (en) * 2015-06-22 2017-01-12 オカムラ印刷株式会社 Information processing system, information processing device, information processing program, portable terminal device, and its control program
US20170138753A1 (en) * 2004-12-31 2017-05-18 Google Inc. Transportation Routing
US20190145788A1 (en) * 2016-05-04 2019-05-16 Tomtom Navigation B.V. Methods and Systems for Determining Safe Return Range
US10571577B2 (en) * 2004-01-16 2020-02-25 Adidas Ag Systems and methods for presenting route traversal information
US10643134B2 (en) * 2012-08-16 2020-05-05 Samsung Electronics Co., Ltd. Schedule management method, schedule management server, and mobile terminal using the method
US10943688B2 (en) 2001-02-20 2021-03-09 Adidas Ag Performance monitoring systems and methods
CN114253261A (en) * 2021-12-08 2022-03-29 广州极飞科技股份有限公司 Path generation method, job control method and related device
US20220121845A1 (en) * 2020-10-20 2022-04-21 Aero Record Technologies Inc. Systems and methods for dynamic digitization and extraction of aviation-related data
US11392992B2 (en) * 2012-11-30 2022-07-19 Panasonic Intellectual Property Corporation Of America Information providing method

Families Citing this family (262)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10573093B2 (en) * 1995-06-07 2020-02-25 Automotive Technologies International, Inc. Vehicle computer design and use techniques for receiving navigation software
EP1072987A1 (en) * 1999-07-29 2001-01-31 International Business Machines Corporation Geographic web browser and iconic hyperlink cartography
US6477542B1 (en) * 2000-07-27 2002-11-05 Dimitrios Papaioannou Device for location-dependent automatic delivery of information with integrated custom print-on-demand
US6501391B1 (en) 1999-09-28 2002-12-31 Robert Vincent Racunas, Jr. Internet communication of parking lot occupancy
US6946974B1 (en) 1999-09-28 2005-09-20 Racunas Jr Robert Vincent Web-based systems and methods for internet communication of substantially real-time parking data
US6651241B1 (en) * 1999-09-29 2003-11-18 Lucent Technologies Inc. Scriptor and interpreter
US7142196B1 (en) * 1999-10-12 2006-11-28 Autodesk, Inc. Geographical data markup on a personal digital assistant (PDA)
US6700590B1 (en) * 1999-11-01 2004-03-02 Indx Software Corporation System and method for retrieving and presenting data using class-based component and view model
DE60137660D1 (en) * 2000-03-17 2009-04-02 Panasonic Corp Map display and navigation device
US7142205B2 (en) 2000-03-29 2006-11-28 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US7743074B1 (en) 2000-04-05 2010-06-22 Microsoft Corporation Context aware systems and methods utilizing hierarchical tree structures
US7096029B1 (en) 2000-04-05 2006-08-22 Microsoft Corporation Context aware computing devices having a common interface and related methods
JP2001297136A (en) * 2000-04-13 2001-10-26 Nec Corp Travel information distribution system
EP1152220B1 (en) * 2000-05-04 2008-07-09 Continental Automotive GmbH Navigation route planning system
US20020073000A1 (en) * 2000-05-05 2002-06-13 Mike Sage System and method for implementing a wireless network in a service center for generating a repair order
EP1172741A3 (en) * 2000-07-13 2004-09-01 Sony Corporation On-demand image delivery server, image resource database, client terminal, and method of displaying retrieval result
US20030078729A1 (en) * 2000-08-04 2003-04-24 Eriko Ohdachi Route guide information generator, route guide information generating method, and navigation system
JP2002056024A (en) * 2000-08-11 2002-02-20 Kankyo Kagaku Kk Method and system for providing travel destination information
JP2002054940A (en) * 2000-08-11 2002-02-20 Alpine Electronics Inc Travel plan assisting system and information storing medium readable by computer
US6600982B1 (en) * 2000-08-23 2003-07-29 International Business Machines Corporation System, method and article of manufacture to provide output according to trip information
JP3847539B2 (en) * 2000-08-29 2006-11-22 株式会社日立製作所 Image information processing device
JP4118006B2 (en) * 2000-09-01 2008-07-16 トヨタ自動車株式会社 Information provision system
DE10043797A1 (en) * 2000-09-06 2002-03-28 Daimler Chrysler Ag Integrated traffic monitoring system
DE10044123A1 (en) * 2000-09-07 2002-03-21 Bosch Gmbh Robert Device for generating and / or processing navigation information
JP3395768B2 (en) * 2000-09-11 2003-04-14 日本電気株式会社 Time management system and method
US6941220B2 (en) * 2000-09-12 2005-09-06 Center Comm Corporation Apparatus and method for vehicle navigation
JP3730110B2 (en) * 2000-09-22 2005-12-21 セイコーエプソン株式会社 Travel information providing method and travel information providing system
US6700534B2 (en) * 2000-10-16 2004-03-02 Scott C. Harris Position privacy in an electronic device
JP3771437B2 (en) * 2000-10-27 2006-04-26 日本電信電話株式会社 Information exchange method and information exchange system
FR2816139B1 (en) * 2000-11-02 2004-04-16 Medium Sl GLOBAL LOCATION MESSAGE TRANSMISSION SYSTEM
JP2002140619A (en) * 2000-11-06 2002-05-17 Nippon Telegr & Teleph Corp <Ntt> Information providing method in railroad/bus line information providing system, and railroad/bus line information providing system
US7987186B1 (en) * 2000-11-06 2011-07-26 Navteq North America, Llc Method and system for wavelet-based representation and use of cartographic data
JP3541035B2 (en) * 2000-11-08 2004-07-07 松下電器産業株式会社 system
JP3722221B2 (en) * 2000-11-08 2005-11-30 松下電器産業株式会社 apparatus
JP3499851B2 (en) * 2000-11-08 2004-02-23 松下電器産業株式会社 Navigation system, navigation method, and medium
CN1862230B (en) * 2000-11-08 2011-07-20 松下电器产业株式会社 Navigation display system
JP2002149528A (en) 2000-11-13 2002-05-24 Sharp Corp Information providing system, server used for the system information providing method, and machine readable recording medium for realizing the method
JP4578668B2 (en) * 2000-11-14 2010-11-10 株式会社ジェイアール総研情報システム Data communication method and communication management server
JP2002157306A (en) * 2000-11-20 2002-05-31 Nec Software Chubu Ltd Travel information selling system and travel information selling machine and its selling method and recording medium with its method recorded
JP2002168645A (en) * 2000-11-29 2002-06-14 Sharp Corp Navigation apparatus, and communication base station and system and method for navigation using them
DE50112874D1 (en) * 2000-12-02 2007-09-27 Volkswagen Ag METHOD AND DEVICE FOR PRESENTING INFORMATION
JP4552079B2 (en) * 2000-12-19 2010-09-29 ソフトバンク株式会社 Information distribution service system linked to personal calendar
US7493565B2 (en) 2000-12-22 2009-02-17 Microsoft Corporation Environment-interactive context-aware devices and methods
US7072956B2 (en) 2000-12-22 2006-07-04 Microsoft Corporation Methods and systems for context-aware policy determination and enforcement
US6944679B2 (en) * 2000-12-22 2005-09-13 Microsoft Corp. Context-aware systems and methods, location-aware systems and methods, context-aware vehicles and methods of operating the same, and location-aware vehicles and methods of operating the same
JP2002203295A (en) * 2000-12-28 2002-07-19 Toyota Motor Corp System, device and method for reserving travel
US6581000B2 (en) * 2001-01-04 2003-06-17 Carnegie Mellon University Position location system and method
JP2002230022A (en) * 2001-01-31 2002-08-16 Mitsubishi Electric Corp Information providing method and portable information terminal
JP3726748B2 (en) * 2001-02-26 2005-12-14 日本電気株式会社 Mobile marketing method, system, server, user terminal, analysis terminal, and program
US20020161756A1 (en) * 2001-02-28 2002-10-31 Fesq William Mcbride System and method for performing local searches across user defined events
JP2002268976A (en) * 2001-03-09 2002-09-20 Nippon Telegr & Teleph Corp <Ntt> Advertising information providing method using traveling object advertising information providing system
US7072908B2 (en) 2001-03-26 2006-07-04 Microsoft Corporation Methods and systems for synchronizing visualizations with audio streams
JP4581280B2 (en) * 2001-03-29 2010-11-17 ソニー株式会社 Reception device and method, transmission device and method, communication system, recording medium, and program
US20020142268A1 (en) * 2001-03-29 2002-10-03 International Business Machines Corporation Method to transmit astrological information on a locator enabled mobile device
JP2002297509A (en) * 2001-03-29 2002-10-11 Sony Corp System and method, recording medium, and program
JP4878693B2 (en) * 2001-04-11 2012-02-15 アイシン・エィ・ダブリュ株式会社 Mobile communication device, route guidance information distribution method and program
JP2002318744A (en) * 2001-04-23 2002-10-31 Sony Corp System and method for providing information, portable terminal equipment, local server device and storage medium
JP2002337689A (en) * 2001-05-18 2002-11-27 Nec Corp Traffic guiding method, and traffic guiding system and program
JP3980844B2 (en) * 2001-05-31 2007-09-26 富士通株式会社 GUIDANCE INFORMATION REQUEST DEVICE AND GUIDANCE INFORMATION PROVIDING DEVICE
FR2825500A1 (en) * 2001-06-01 2002-12-06 Renault Method for guiding a vehicle, comprises on board unit and fixed server and a number of data exchange strategies which are selected according to factors such as server load and communication coverage
FR2825501A1 (en) * 2001-06-01 2002-12-06 Renault Method for guiding a vehicle, comprises on board vehicle unit and fixed server which uses position data from vehicle and cartographic database for calculation of an itinerary using minimum data
GB2376283B (en) * 2001-06-04 2005-03-16 Hewlett Packard Co Foot activated user input
US6558164B2 (en) * 2001-06-04 2003-05-06 Hewlett-Packard Development Company, L.P. Method and system for simulating travel
JP2002360937A (en) * 2001-06-08 2002-12-17 Konami Computer Entertainment Osaka:Kk Data delivery system, data delivery server, and video game device
US20030016744A1 (en) * 2001-07-04 2003-01-23 Canon Kabushiki Kaisha Data processing device, data processing method, computer readable recording medium, and data processing program to prevent illegal reproduction of information data
DE10134055C1 (en) * 2001-07-13 2003-04-24 Eads Deutschland Gmbh Vehicle guidance system and method for performing automatic vehicle guidance
JP2003141017A (en) * 2001-07-16 2003-05-16 Matsushita Electric Ind Co Ltd Method for providing contents delivery service and terminal device
US6604047B1 (en) * 2001-08-03 2003-08-05 Scott C. Harris Non real time traffic system for a navigator
EP1439477A1 (en) * 2001-08-10 2004-07-21 Matsushita Electric Industrial Co., Ltd. Electronic device
JP2003057045A (en) * 2001-08-15 2003-02-26 Sony Corp Method, apparatus and system for display of map
JP2003067721A (en) * 2001-08-24 2003-03-07 Pioneer Electronic Corp Map image display system and method
JP4577298B2 (en) * 2001-08-31 2010-11-10 アイシン・エィ・ダブリュ株式会社 Information display system
JPWO2003021189A1 (en) 2001-08-31 2004-12-16 アイシン・エィ・ダブリュ株式会社 Information display system
JP2003077092A (en) 2001-09-03 2003-03-14 Hitachi Ltd Remote control unit, on-vehicle unit, and remote control method
JP3841401B2 (en) * 2001-09-26 2006-11-01 株式会社東芝 Campus guidance device, server device, and program
JP4705291B2 (en) * 2001-09-28 2011-06-22 クラリオン株式会社 Navigation device, server device, navigation method, and navigation software
JP2003106862A (en) 2001-09-28 2003-04-09 Pioneer Electronic Corp Map plotting apparatus
US6989770B1 (en) 2001-10-03 2006-01-24 Navteq North America, Llc Navigation system that supports multiple languages and formats
JP3642517B2 (en) * 2001-10-05 2005-04-27 国新産業株式会社 Mobile simulation experience device
JP2003130660A (en) * 2001-10-23 2003-05-08 Alpine Electronics Inc Facility information providing device
JP2003130672A (en) * 2001-10-25 2003-05-08 Aisin Aw Co Ltd Information displaying system
KR20040047736A (en) * 2001-10-25 2004-06-05 아이신에이더블류 가부시키가이샤 Information display system
JP2003216640A (en) * 2001-11-19 2003-07-31 Matsushita Electric Ind Co Ltd Data processing apparatus and method
JP2003162217A (en) * 2001-11-26 2003-06-06 Nec Corp Map information display system, portable radio terminal and server
JP3702843B2 (en) * 2001-12-17 2005-10-05 日産自動車株式会社 In-vehicle information playback device
JP3705197B2 (en) * 2001-12-17 2005-10-12 日産自動車株式会社 Vehicle information providing device
JP3933929B2 (en) * 2001-12-28 2007-06-20 アルパイン株式会社 Navigation device
JP2003240567A (en) * 2002-02-13 2003-08-27 Mitsubishi Electric Corp Navigation apparatus and method therefor
US8249880B2 (en) * 2002-02-14 2012-08-21 Intellisist, Inc. Real-time display of system instructions
US7146179B2 (en) * 2002-03-26 2006-12-05 Parulski Kenneth A Portable imaging device employing geographic information to facilitate image access and viewing
TW575844B (en) * 2002-05-14 2004-02-11 Via Tech Inc Group virtual reality touring system and operation method
JP2004048674A (en) * 2002-05-24 2004-02-12 Olympus Corp Information presentation system of visual field agreement type, portable information terminal, and server
JP2003157489A (en) * 2002-06-03 2003-05-30 Equos Research Co Ltd Operation control device
US7289138B2 (en) * 2002-07-02 2007-10-30 Fuji Xerox Co., Ltd. Intersection detection in panoramic video
US20040006926A1 (en) * 2002-07-15 2004-01-15 Neeley Clifton B. Climate controlled practice facility and method utilizing the same
JP4092976B2 (en) * 2002-08-05 2008-05-28 ソニー株式会社 Guide system, content server, and information processing program
JPWO2004019225A1 (en) 2002-08-26 2005-12-15 富士通株式会社 Apparatus and method for processing status information
JP2004096621A (en) 2002-09-03 2004-03-25 Fujitsu Ltd Information distribution service system based on prediction of positional change of mobile information terminal
US6889147B2 (en) * 2002-09-17 2005-05-03 Hydrogenics Corporation System, computer program product and method for controlling a fuel cell testing device
US6978224B2 (en) * 2002-09-17 2005-12-20 Hydrogenics Corporation Alarm recovery system and method for fuel cell testing systems
US7707140B2 (en) * 2002-10-09 2010-04-27 Yahoo! Inc. Information retrieval system and method employing spatially selective features
US7835858B2 (en) * 2002-11-22 2010-11-16 Traffic.Com, Inc. Method of creating a virtual traffic network
JPWO2004061392A1 (en) * 2002-12-27 2006-05-18 富士通株式会社 Action support method and apparatus
JP4050983B2 (en) * 2002-12-27 2008-02-20 有限会社ヴェルク・ジャパン Portable walking navigation device
US7254481B2 (en) 2002-12-27 2007-08-07 Fujitsu Limited Action support method and apparatus
JP2004212295A (en) * 2003-01-07 2004-07-29 Mitsubishi Electric Corp Navigation system
US7437675B2 (en) * 2003-02-03 2008-10-14 Hewlett-Packard Development Company, L.P. System and method for monitoring event based systems
ES2288678T3 (en) * 2003-02-26 2008-01-16 Tomtom International B.V. NAVIGATION DEVICE AND METHOD TO SHOW ALTERNATIVE ROUTES.
JP3711986B2 (en) 2003-03-20 2005-11-02 オムロン株式会社 Information output apparatus and method, recording medium, and program
JP4032355B2 (en) * 2003-03-27 2008-01-16 カシオ計算機株式会社 Display processing apparatus, display control method, and display processing program
JP2004295625A (en) 2003-03-27 2004-10-21 Fujitsu Ltd Area information providing system, and area information providing program
US7415243B2 (en) 2003-03-27 2008-08-19 Honda Giken Kogyo Kabushiki Kaisha System, method and computer program product for receiving data from a satellite radio network
AU2003902042A0 (en) * 2003-04-30 2003-05-15 Nextspace Technologies Pty Ltd Delivery and/or collection optimization system & method
US20040229954A1 (en) * 2003-05-16 2004-11-18 Macdougall Diane Elaine Selective manipulation of triglyceride, HDL and LDL parameters with 6-(5-carboxy-5-methyl-hexyloxy)-2,2-dimethylhexanoic acid monocalcium salt
JP2005001533A (en) * 2003-06-12 2005-01-06 Denso Corp On-vehicle e-mail arrival notifying device and e-mail transmitting device
JP2005070658A (en) * 2003-08-27 2005-03-17 Hyojito Kk Map information information providing device, signboard mapping system, signboard map, map data conversion method, and map data conversion program
WO2005028713A1 (en) * 2003-09-22 2005-03-31 Hydrogenics Corporation Electrolyzer cell stack system
US7130741B2 (en) * 2003-10-23 2006-10-31 International Business Machines Corporation Navigating a UAV with a remote control device
US6856894B1 (en) * 2003-10-23 2005-02-15 International Business Machines Corporation Navigating a UAV under remote control and manual control with three dimensional flight depiction
US7107148B1 (en) * 2003-10-23 2006-09-12 International Business Machines Corporation Navigating a UAV with on-board navigation algorithms with flight depiction
US7590553B2 (en) * 2003-10-27 2009-09-15 Microsoft Corporation Integrated spatial view of time, location, and event schedule information
US20050114014A1 (en) * 2003-11-24 2005-05-26 Isaac Emad S. System and method to notify a person of a traveler's estimated time of arrival
US20050114020A1 (en) * 2003-11-25 2005-05-26 Nissan Motor Co., Ltd. Navigation device, car navigation program, display device, and display control program for presenting information on branch destination
US7698292B2 (en) * 2003-12-03 2010-04-13 Siemens Aktiengesellschaft Tag management within a decision, support, and reporting environment
US20050144015A1 (en) * 2003-12-08 2005-06-30 International Business Machines Corporation Automatic identification of optimal audio segments for speech applications
US7818380B2 (en) 2003-12-15 2010-10-19 Honda Motor Co., Ltd. Method and system for broadcasting safety messages to a vehicle
US8041779B2 (en) 2003-12-15 2011-10-18 Honda Motor Co., Ltd. Method and system for facilitating the exchange of information between a vehicle and a remote location
JP2005202788A (en) * 2004-01-16 2005-07-28 National Institute Of Advanced Industrial & Technology Method, device and program for searching for space, and computer- readable recording medium for recording this program
US20050195096A1 (en) * 2004-03-05 2005-09-08 Ward Derek K. Rapid mobility analysis and vehicular route planning from overhead imagery
US7814153B2 (en) * 2004-03-12 2010-10-12 Prototerra, Inc. System and method for client side managed data prioritization and connections
WO2005099379A2 (en) 2004-04-06 2005-10-27 Honda Motor Co., Ltd. Method and system for controlling the exchange of vehicle related messages
US7069147B2 (en) * 2004-05-28 2006-06-27 Honeywell International Inc. Airborne based monitoring
US6937937B1 (en) 2004-05-28 2005-08-30 Honeywell International Inc. Airborne based monitoring
JP2006012081A (en) * 2004-06-29 2006-01-12 Kenwood Corp Content output device, navigation device, content output program and content output method
US20060238378A1 (en) * 2004-07-29 2006-10-26 Eriko Ohdachi Communication type map display device
US7840607B2 (en) * 2004-08-06 2010-11-23 Siemens Aktiengesellschaft Data mart generation and use in association with an operations intelligence platform
US8700671B2 (en) * 2004-08-18 2014-04-15 Siemens Aktiengesellschaft System and methods for dynamic generation of point / tag configurations
JP2006072642A (en) * 2004-09-01 2006-03-16 Noritsu Koki Co Ltd Tourism information guide apparatus
US8117073B1 (en) 2004-09-17 2012-02-14 Rearden Commerce, Inc. Method and system for delegation of travel arrangements by a temporary agent
US7643788B2 (en) 2004-09-22 2010-01-05 Honda Motor Co., Ltd. Method and system for broadcasting data messages to a vehicle
US7480567B2 (en) * 2004-09-24 2009-01-20 Nokia Corporation Displaying a map having a close known location
US7962381B2 (en) * 2004-10-15 2011-06-14 Rearden Commerce, Inc. Service designer solution
JP4282591B2 (en) * 2004-11-30 2009-06-24 株式会社東芝 Schedule management apparatus, schedule management method, and program
US7814123B2 (en) * 2004-12-02 2010-10-12 Siemens Aktiengesellschaft Management of component members using tag attributes
US20060129313A1 (en) * 2004-12-14 2006-06-15 Becker Craig H System and method for driving directions based on non-map criteria
US7970666B1 (en) 2004-12-30 2011-06-28 Rearden Commerce, Inc. Aggregate collection of travel data
US8442938B2 (en) * 2005-01-14 2013-05-14 Siemens Aktiengesellschaft Child data structure update in data management system
US20080147450A1 (en) * 2006-10-16 2008-06-19 William Charles Mortimore System and method for contextualized, interactive maps for finding and booking services
JP4632824B2 (en) * 2005-03-18 2011-02-16 Necカシオモバイルコミュニケーションズ株式会社 Portable terminal device and program
US20060218116A1 (en) * 2005-03-28 2006-09-28 O'hearn James E Pass-through interface queries to populate a class-based model
US7353034B2 (en) 2005-04-04 2008-04-01 X One, Inc. Location sharing and tracking using mobile phones or other wireless devices
US20060243788A1 (en) * 2005-04-04 2006-11-02 David Waco Method and apparatus for wireless PC tablet presentation process
US20060230056A1 (en) * 2005-04-06 2006-10-12 Nokia Corporation Method and a device for visual management of metadata
US7933395B1 (en) 2005-06-27 2011-04-26 Google Inc. Virtual tour of user-defined paths in a geographic information system
US7353114B1 (en) * 2005-06-27 2008-04-01 Google Inc. Markup language for an interactive geographic information system
US7706808B1 (en) 2005-07-07 2010-04-27 Rearden Commerce, Inc. One-click service status tracking and updates
US7742954B1 (en) 2005-07-07 2010-06-22 Rearden Commerce, Inc. Method and system for an enhanced portal for services suppliers
US8160614B2 (en) * 2005-08-05 2012-04-17 Targus Information Corporation Automated concierge system and method
EP1934801B1 (en) * 2005-08-16 2019-07-03 LG Electronics Inc. Terminal and method for supporting dynamic contents delivery service
US20070050128A1 (en) * 2005-08-31 2007-03-01 Garmin Ltd., A Cayman Islands Corporation Method and system for off-board navigation with a portable device
JP4642607B2 (en) * 2005-08-31 2011-03-02 有限会社ヴェルク・ジャパン Audio / image playback method using mobile playback device
JP2007093334A (en) * 2005-09-28 2007-04-12 Zenrin Co Ltd Route guide system
WO2007036926A2 (en) * 2005-09-28 2007-04-05 Shmuel Weber Control-information system for mass transportation vehicles
US7933897B2 (en) 2005-10-12 2011-04-26 Google Inc. Entity display priority in a distributed geographic information system
US8046162B2 (en) 2005-11-04 2011-10-25 Honda Motor Co., Ltd. Data broadcast method for traffic information
US20070124306A1 (en) * 2005-11-09 2007-05-31 Honda Motor Co., Ltd. Method and system for transmitting data to vehicles over limited data links
DE102005056047A1 (en) * 2005-11-24 2007-06-06 Siemens Ag Vehicle Information System
US9117223B1 (en) 2005-12-28 2015-08-25 Deem, Inc. Method and system for resource planning for service provider
JP2007226370A (en) * 2006-02-22 2007-09-06 Fujitsu Ltd Information disclosure control method and device, and information disclosure control instruction method
US7707516B2 (en) * 2006-05-26 2010-04-27 Google Inc. Embedded navigation interface
WO2007146967A2 (en) * 2006-06-12 2007-12-21 Google Inc. Markup language for interactive geographic information system
US7941374B2 (en) * 2006-06-30 2011-05-10 Rearden Commerce, Inc. System and method for changing a personal profile or context during a transaction
US20080004919A1 (en) * 2006-06-30 2008-01-03 Rearden Commerce, Inc. Triggered transactions based on criteria
US8073719B2 (en) * 2006-06-30 2011-12-06 Rearden Commerce, Inc. System and method for core identity with personas across multiple domains with permissions on profile data based on rights of domain
US20080004980A1 (en) * 2006-06-30 2008-01-03 Rearden Commerce, Inc. System and method for regulating supplier acceptance of service requests
US20080004917A1 (en) * 2006-06-30 2008-01-03 Rearden Commerce, Inc. System and method for automatically rebooking reservations
EP1879001B1 (en) * 2006-07-10 2016-04-27 Harman Becker Automotive Systems GmbH Format description for a navigation database
US8095402B2 (en) * 2006-07-10 2012-01-10 Rearden Commerce, Inc. System and method for transferring a service policy between domains
JP2008020980A (en) * 2006-07-11 2008-01-31 Kenwood Corp Agent device, program, and proposal method in agent device
NZ549547A (en) * 2006-08-31 2008-12-24 Trekwizard Ltd Methods and apparatus of grading a route and calculating travel time information for the route
CN101216823A (en) * 2007-01-04 2008-07-09 阿里巴巴公司 Website navigation system and website navigation method
US20080201432A1 (en) * 2007-02-16 2008-08-21 Rearden Commerce, Inc. System and Method for Facilitating Transfer of Experience Data in to Generate a New Member Profile for a Online Service Portal
US8260783B2 (en) * 2007-02-27 2012-09-04 Siemens Aktiengesellschaft Storage of multiple, related time-series data streams
EP1965172B1 (en) * 2007-03-02 2010-11-17 Alpine Electronics, Inc. Information display system and method for displaying information associated with map related data
US8584013B1 (en) * 2007-03-20 2013-11-12 Google Inc. Temporal layers for presenting personalization markers on imagery
US8024110B2 (en) 2007-05-22 2011-09-20 Xanavi Informatics Corporation Method of estimation of traffic information, device of estimation of traffic information and car navigation device
US7668653B2 (en) 2007-05-31 2010-02-23 Honda Motor Co., Ltd. System and method for selectively filtering and providing event program information
JP4513829B2 (en) * 2007-06-07 2010-07-28 株式会社デンソー Information processing apparatus for moving body and car navigation apparatus
US8302033B2 (en) * 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US20090006143A1 (en) * 2007-06-26 2009-01-01 Rearden Commerce, Inc. System and Method for Interactive Natural Language Rebooking or Rescheduling of Calendar Activities
WO2009014081A1 (en) * 2007-07-23 2009-01-29 Clarion Co., Ltd. Navigation device
DE102007042038A1 (en) * 2007-09-05 2009-03-12 Navigon Gmbh Navigation device and method for operating a navigation device
JP4618284B2 (en) * 2007-09-10 2011-01-26 ソニー株式会社 Portable device and information processing program
US8099308B2 (en) 2007-10-02 2012-01-17 Honda Motor Co., Ltd. Method and system for vehicle service appointments based on diagnostic trouble codes
FR2922347B1 (en) * 2007-10-11 2010-10-15 Bouchaib Hoummady METHOD AND DEVICE FOR DYNAMICALLY MANAGING GUIDANCE AND MOBILITY IN TRAFFIC BY TAKING INTO ACCOUNT GROUND SPACE OCCUPANCY BY VEHICLES
JP4584297B2 (en) * 2007-11-09 2010-11-17 株式会社エヌ・ティ・ティ・ドコモ GUIDANCE INFORMATION PROVIDING DEVICE AND GUIDANCE INFORMATION PROVIDING METHOD
JP2009128182A (en) * 2007-11-22 2009-06-11 Pioneer Electronic Corp Information presentation device
US20090177513A1 (en) * 2008-01-04 2009-07-09 Colin John Eckhart Device and Method for Dynamic Itinerary Planning and Tracking for Mobile Communications Device
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US20090210261A1 (en) * 2008-02-20 2009-08-20 Rearden Commerce, Inc. System and Method for Multi-Modal Travel Shopping
US8275394B2 (en) * 2008-03-20 2012-09-25 Nokia Corporation Nokia places floating profile
US20090248457A1 (en) * 2008-03-31 2009-10-01 Rearden Commerce, Inc. System and Method for Providing Travel Schedule of Contacts
US8370111B2 (en) * 2008-03-31 2013-02-05 The Boeing Company System and method for forming optimized perimeter surveillance
US8686854B2 (en) * 2008-03-31 2014-04-01 The Boeing Company System and method for forming optimized perimeter surveillance
US8082494B2 (en) * 2008-04-18 2011-12-20 Microsoft Corporation Rendering markup language macro data for display in a graphical user interface
JP5448372B2 (en) * 2008-05-26 2014-03-19 日本電信電話株式会社 Selective information presentation device and selective information presentation processing program
US20090306882A1 (en) * 2008-06-10 2009-12-10 Yahoo! Inc. System, apparatus, or method for enhanced route directions
US8335608B2 (en) * 2008-06-11 2012-12-18 The Boeing Company Monitoring vehicle and equipment operations at an airport
KR101481443B1 (en) * 2008-09-12 2015-01-12 삼성전자주식회사 A method for management device in a communication network and a system thereof
US20100121564A1 (en) * 2008-10-31 2010-05-13 Hitachi Automotive Systems, Ltd. Remote guide system, remote guide method and remote guide device
JP4591595B2 (en) * 2008-11-25 2010-12-01 トヨタ自動車株式会社 Destination user number prediction system and center
JP4835684B2 (en) * 2008-12-17 2011-12-14 株式会社デンソー Information providing system and in-vehicle device
US8311556B2 (en) 2009-01-22 2012-11-13 Htc Corporation Method and system for managing images and geographic location data in a mobile device
US20100198882A1 (en) * 2009-02-01 2010-08-05 Shrestha Roshan B Systems and methods for creating multiple logbooks in a computer application.
US20100211419A1 (en) * 2009-02-13 2010-08-19 Rearden Commerce, Inc. Systems and Methods to Present Travel Options
JP5391775B2 (en) * 2009-03-27 2014-01-15 ソニー株式会社 Digital cinema management apparatus and digital cinema management method
WO2010119182A1 (en) * 2009-04-14 2010-10-21 Bouchaib Hoummady Method and device for dynamically managing guiding and mobility in traffic
US10552849B2 (en) 2009-04-30 2020-02-04 Deem, Inc. System and method for offering, tracking and promoting loyalty rewards
KR101638135B1 (en) * 2009-05-12 2016-07-20 팅크웨어(주) Navigation device, navigation system, and operating method thereof
JP5492694B2 (en) * 2009-07-31 2014-05-14 クラリオン株式会社 Navigation device, program, and display method
US8175794B1 (en) * 2009-10-06 2012-05-08 Google Inc. Switching between best views of a place
US8862576B2 (en) 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US8627203B2 (en) 2010-02-25 2014-01-07 Adobe Systems Incorporated Method and apparatus for capturing, analyzing, and converting scripts
JP5126272B2 (en) * 2010-03-31 2013-01-23 株式会社デンソー Navigation system
JP5516124B2 (en) * 2010-06-22 2014-06-11 株式会社デンソーアイティーラボラトリ Content reproduction system and reproduction method according to position information
EP2416122A3 (en) * 2010-08-02 2013-11-20 Navigon AG Method for operating a navigation system
CN102376304B (en) * 2010-08-10 2014-04-30 鸿富锦精密工业(深圳)有限公司 Text reading system and text reading method thereof
JP2012073061A (en) * 2010-09-28 2012-04-12 Canvas Mapple Co Ltd Navigation device, navigation program, and center system
US9449288B2 (en) 2011-05-20 2016-09-20 Deem, Inc. Travel services search
JP5874225B2 (en) * 2011-07-20 2016-03-02 アイシン・エィ・ダブリュ株式会社 Movement guidance system, movement guidance apparatus, movement guidance method, and computer program
US9267806B2 (en) * 2011-08-29 2016-02-23 Bayerische Motoren Werke Aktiengesellschaft System and method for automatically receiving geo-relevant information in a vehicle
US9171384B2 (en) * 2011-11-08 2015-10-27 Qualcomm Incorporated Hands-free augmented reality for wireless communication devices
JP5803645B2 (en) * 2011-12-15 2015-11-04 アイシン・エィ・ダブリュ株式会社 Evaluation display system, method and program
US8930813B2 (en) * 2012-04-03 2015-01-06 Orlando McMaster Dynamic text entry/input system
CN104411558B (en) * 2012-07-06 2017-09-22 丰田自动车株式会社 The travel controlling system of vehicle
JP6005474B2 (en) * 2012-10-30 2016-10-12 株式会社 ミックウェア Information processing system, information processing terminal, information processing method, and program
CN103471610B (en) * 2013-09-24 2016-05-25 沈阳美行科技有限公司 A kind ofly support online, the double mode air navigation aid of off-line
US9733095B2 (en) * 2013-10-07 2017-08-15 Telenav, Inc. Navigation system with guidance delivery mechanism and method of operation thereof
US20160111007A1 (en) 2013-10-21 2016-04-21 Rhett Rodney Dennerline Database System To Organize Selectable Items For Users Related to Route Planning
JP5972301B2 (en) * 2014-02-20 2016-08-17 本田技研工業株式会社 Visit plan creation system, terminal device, and visit plan creation method
JP6465376B2 (en) * 2014-06-16 2019-02-06 株式会社インタラクティブソリューションズ Display information management system
KR102300034B1 (en) * 2014-07-04 2021-09-08 엘지전자 주식회사 Digital image processing apparatus and controlling method thereof
US10051412B2 (en) 2015-02-25 2018-08-14 Ricoh Company, Ltd. Locational information transmission system, locational information transmission apparatus, and information processing device
US20170286575A1 (en) 2016-03-31 2017-10-05 Cae Inc. Method and systems for anticipatorily updating a remote repository
US10115320B2 (en) 2016-03-31 2018-10-30 Cae Inc. Method and systems for updating a remote repository based on data-types
US9734184B1 (en) 2016-03-31 2017-08-15 Cae Inc. Method and systems for removing the most extraneous data record from a remote repository
DE112017002041T5 (en) * 2016-04-14 2019-01-10 Sony Corporation Information processing apparatus, information processing method and mobile apparatus
JP6692209B2 (en) * 2016-05-11 2020-05-13 株式会社日立製作所 Parking management system and control method thereof
JP6663824B2 (en) * 2016-08-19 2020-03-13 株式会社 ミックウェア Navigation system and computer program
JP2018041285A (en) * 2016-09-07 2018-03-15 富士通株式会社 Schedule management program, schedule management method, and schedule management device
US10884162B2 (en) 2017-01-11 2021-01-05 Weathervane Labs, Llc Determining personal outdoor comfort with individual and environmental parameters
US10692023B2 (en) * 2017-05-12 2020-06-23 International Business Machines Corporation Personal travel assistance system and method for traveling through a transport hub
US11182950B2 (en) 2017-05-23 2021-11-23 Sony Corporation Information processing device and information processing method
US11693888B1 (en) * 2018-07-12 2023-07-04 Intuit, Inc. Intelligent grouping of travel data for review through a user interface
US11610498B2 (en) 2018-11-28 2023-03-21 Kyndryl, Inc. Voice interactive portable computing device for learning about places of interest
CN113203423B (en) * 2019-09-29 2024-02-02 百度在线网络技术(北京)有限公司 Map navigation simulation method and device
US11514314B2 (en) 2019-11-25 2022-11-29 International Business Machines Corporation Modeling environment noise for training neural networks
US11228544B2 (en) 2020-01-09 2022-01-18 International Business Machines Corporation Adapting communications according to audience profile from social media
US11692839B2 (en) 2020-05-20 2023-07-04 Here Global B.V. Methods and apparatuses for providing navigation instructions
US11748558B2 (en) * 2020-10-27 2023-09-05 Disney Enterprises, Inc. Multi-persona social agent
US11798295B2 (en) * 2021-04-27 2023-10-24 GM Global Technology Operations LLC Model free lane tracking system
WO2022260657A1 (en) * 2021-06-08 2022-12-15 Templer Lisa Boat sharing system

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159687A (en) * 1989-11-14 1992-10-27 Caseworks, Inc. Method and apparatus for generating program code files
US5272638A (en) * 1991-05-31 1993-12-21 Texas Instruments Incorporated Systems and methods for planning the scheduling travel routes
US5579535A (en) * 1991-07-01 1996-11-26 Motorola, Inc. Personal communication system providing supplemental information mode
US5442557A (en) * 1991-07-26 1995-08-15 Pioneer Electronic Corporation Navigation device
US5237499A (en) * 1991-11-12 1993-08-17 Garback Brent J Computer travel planning system
US6282489B1 (en) * 1993-05-28 2001-08-28 Mapquest.Com, Inc. Methods and apparatus for displaying a travel route and generating a list of places of interest located near the travel route
US5402120A (en) * 1993-08-18 1995-03-28 Zexel Corporation Navigation system
JP3541403B2 (en) 1993-09-16 2004-07-14 松下電器産業株式会社 Electronic map display device and navigation device
JPH07318361A (en) 1994-05-27 1995-12-08 Victor Co Of Japan Ltd Navigation apparatus
US5948040A (en) * 1994-06-24 1999-09-07 Delorme Publishing Co. Travel reservation information and planning system
EP0772856B1 (en) * 1994-07-29 1998-04-15 Seiko Communications Holding N.V. Dual channel advertising referencing vehicle location
EP0702208B1 (en) * 1994-09-08 2002-05-29 Matsushita Electric Industrial Co., Ltd. Method and system of route selection
US5784059A (en) * 1994-09-16 1998-07-21 Aisin Aw Co., Ltd. Vehicle navigation system with destination selection using hierarchical menu arrangement with selective level skipping
EP1708151A3 (en) * 1995-08-09 2007-01-24 Toyota Jidosha Kabushiki Kaisha Travel plan preparing device
JPH09153054A (en) 1995-11-29 1997-06-10 Nec Corp Information retrieval and transmitting terminal device and retrieval server
JPH09198439A (en) * 1996-01-22 1997-07-31 Toyota Motor Corp Travel plan preparation system
JPH09204475A (en) * 1996-01-24 1997-08-05 Toyota Motor Corp Travel plan generation device
JP3125669B2 (en) * 1996-01-31 2001-01-22 トヨタ自動車株式会社 Travel planning equipment
JPH1011298A (en) 1996-06-24 1998-01-16 Sony Corp Controller and controlling method
JPH1066149A (en) 1996-08-21 1998-03-06 Brother Ind Ltd Guiding information system
JP3503397B2 (en) * 1997-02-25 2004-03-02 Kddi株式会社 Map display system
JP3183209B2 (en) 1997-03-24 2001-07-09 トヨタ自動車株式会社 Communication terminal device, communication system, and storage medium storing program for controlling data processing in communication terminal
JP3220408B2 (en) * 1997-03-31 2001-10-22 富士通テン株式会社 Route guidance device
US6138072A (en) * 1997-04-24 2000-10-24 Honda Giken Kogyo Kabushiki Kaisha Navigation device
US6285932B1 (en) 1997-05-16 2001-09-04 Snap-On Technologies, Inc. Computerized automotive service system
US6091956A (en) * 1997-06-12 2000-07-18 Hollenberg; Dennis D. Situation information system
JPH1151674A (en) * 1997-08-08 1999-02-26 Aisin Aw Co Ltd Car navigation system and recording medium
JP3500928B2 (en) * 1997-09-17 2004-02-23 トヨタ自動車株式会社 Map data processing device, map data processing method, and map data processing system
US6026375A (en) * 1997-12-05 2000-02-15 Nortel Networks Corporation Method and apparatus for processing orders from customers in a mobile environment
US6121924A (en) * 1997-12-30 2000-09-19 Navigation Technologies Corporation Method and system for providing navigation systems with updated geographic data
US6711379B1 (en) * 1998-05-28 2004-03-23 Kabushiki Kaisha Toshiba Digital broadcasting system and terminal therefor
US6401034B1 (en) * 1999-09-02 2002-06-04 Navigation Technologies Corp. Method and system for finding intermediate destinations with a navigation system

Cited By (90)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010009427A1 (en) * 2000-01-25 2001-07-26 Kazuma Kaneko Navigation apparatus and recording medium providing communication between applications
US20020099734A1 (en) * 2000-11-29 2002-07-25 Philips Electronics North America Corp. Scalable parser for extensible mark-up language
US11557388B2 (en) 2001-02-20 2023-01-17 Adidas Ag Performance monitoring systems and methods
US10991459B2 (en) 2001-02-20 2021-04-27 Adidas Ag Performance monitoring systems and methods
US10943688B2 (en) 2001-02-20 2021-03-09 Adidas Ag Performance monitoring systems and methods
US6990408B2 (en) * 2001-03-02 2006-01-24 Kabushiki Kaisha Toshiba Image forming system and image forming apparatus
US20050043883A1 (en) * 2001-03-02 2005-02-24 Kabushiki Kaisha Toshiba Image forming system and image forming apparatus
US7529618B2 (en) 2001-03-02 2009-05-05 Kabushiki Kaisha Toshiba Image forming system and image forming apparatus
US20080201071A1 (en) * 2001-03-29 2008-08-21 Gilad Odinak Vehicle navigation system and method
US6993718B2 (en) * 2001-08-24 2006-01-31 Sony Corporation Information processing method and apparatus
US20030056175A1 (en) * 2001-08-24 2003-03-20 Masahiro Fujihara Information processing method and apparatus
US20050075786A1 (en) * 2001-12-17 2005-04-07 Susumu Sakatani Data providing service system
US20070276588A1 (en) * 2001-12-17 2007-11-29 Matsushita Electric Industrial Data-providing service system
US7398152B2 (en) 2001-12-17 2008-07-08 Matsushita Electric Industrial Data-providing service system
US7233858B2 (en) * 2001-12-17 2007-06-19 Matsushita Electric Industrial Co., Ltd. Data providing service system
US20060074696A1 (en) * 2002-10-15 2006-04-06 Sharp Kabushiki Kaisha Information processing device, information processing method, information processing program and medium
US6845322B1 (en) * 2003-07-15 2005-01-18 Televigation, Inc. Method and system for distributed navigation
US8219925B2 (en) 2003-08-11 2012-07-10 Smith Micro Software, Inc Formatting ticker content in a handheld wireless telecommunication device
US7430724B2 (en) 2003-08-11 2008-09-30 Core Mobility, Inc. Systems and methods for displaying content in a ticker
US8458611B2 (en) 2003-08-11 2013-06-04 Smith Micro Software, Inc. Displaying a map on a handheld wireless telecommunication device
US7343564B2 (en) * 2003-08-11 2008-03-11 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
US7370283B2 (en) 2003-08-11 2008-05-06 Core Mobility, Inc. Systems and methods for populating a ticker using multiple data transmission modes
US20080155453A1 (en) * 2003-08-11 2008-06-26 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
US7747962B2 (en) 2003-08-11 2010-06-29 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
US20060236257A1 (en) * 2003-08-11 2006-10-19 Core Mobility, Inc. Interactive user interface presentation attributes for location-based content
US20050210391A1 (en) * 2003-08-11 2005-09-22 Core Mobility, Inc. Systems and methods for navigating content in an interactive ticker
US7747963B2 (en) 2003-08-11 2010-06-29 Core Mobility, Inc. Displaying location-based content in a ticker of a handheld mobile communication device
US7441203B2 (en) 2003-08-11 2008-10-21 Core Mobility, Inc. Interactive user interface presentation attributes for location-based content
US20050039135A1 (en) * 2003-08-11 2005-02-17 Konstantin Othmer Systems and methods for navigating content in an interactive ticker
US8219926B2 (en) 2003-08-11 2012-07-10 Smith Micro Software, Inc Displaying a map on a handheld wireless telecommunication device
US20060236258A1 (en) * 2003-08-11 2006-10-19 Core Mobility, Inc. Scheduling of rendering of location-based content
US8214738B2 (en) 2003-08-11 2012-07-03 Smith Micro Software, Inc Displaying location-based content in a handheld device
US8539371B2 (en) 2003-08-11 2013-09-17 Smith Micro Software, Inc Formatting ticker content in a handheld wireless telecommunication device
US20050039136A1 (en) * 2003-08-11 2005-02-17 Konstantin Othmer Systems and methods for displaying content in a ticker
US20070093957A1 (en) * 2003-10-23 2007-04-26 Shin Kikuchi Image data transmitting/receiving system, server, mobile phone terminal,program and recording medium
US11493637B2 (en) 2004-01-16 2022-11-08 Adidas Ag Systems and methods for providing a health coaching message
US11650325B2 (en) 2004-01-16 2023-05-16 Adidas Ag Systems and methods for providing a health coaching message
US11150354B2 (en) 2004-01-16 2021-10-19 Adidas Ag Systems and methods for modifying a fitness plan
US11119220B2 (en) 2004-01-16 2021-09-14 Adidas Ag Systems and methods for providing a health coaching message
US10571577B2 (en) * 2004-01-16 2020-02-25 Adidas Ag Systems and methods for presenting route traversal information
US7925540B1 (en) * 2004-10-15 2011-04-12 Rearden Commerce, Inc. Method and system for an automated trip planner
US7904239B2 (en) * 2004-11-15 2011-03-08 Pioneer Corporation Travel route display system
US20090082958A1 (en) * 2004-11-15 2009-03-26 Pioneer Corporation Travel Route Display System
US9243928B2 (en) 2004-11-16 2016-01-26 Microsoft Technology Licensing, Llc Methods for automated and semiautomated composition of visual sequences, flows, and flyovers based on content and context
US10184803B2 (en) 2004-11-16 2019-01-22 Microsoft Technology Licensing, Llc Methods for automated and semiautomated composition of visual sequences, flows, and flyovers based on content and context
US9267811B2 (en) * 2004-11-16 2016-02-23 Microsoft Technology Licensing, Llc Methods for automated and semiautomated composition of visual sequences, flows, and flyovers based on content and context
US20140100779A1 (en) * 2004-11-16 2014-04-10 Microsoft Corporation Methods for automated and semiautomated composition of visual sequences, flows, and flyovers based on content and context
US20170138753A1 (en) * 2004-12-31 2017-05-18 Google Inc. Transportation Routing
US9945686B2 (en) * 2004-12-31 2018-04-17 Google Llc Transportation routing
US11092455B2 (en) 2004-12-31 2021-08-17 Google Llc Transportation routing
US20060230108A1 (en) * 2005-04-07 2006-10-12 Olympus Corporation Information display system
US7996177B2 (en) 2005-04-07 2011-08-09 Olympus Corporation Information display system
US20090276318A1 (en) * 2006-04-20 2009-11-05 Mitac International Corp. Navigation Provision System and Framework for Providing Content to an End User
US20070291034A1 (en) * 2006-06-20 2007-12-20 Dones Nelson C System for presenting a navigable virtual subway system, and method for operating and using the same
US7843469B2 (en) * 2006-07-21 2010-11-30 The Boeing Company Overlaying information onto a view for electronic display
US20080018659A1 (en) * 2006-07-21 2008-01-24 The Boeing Company Overlaying information onto a view for electronic display
US20090321512A1 (en) * 2006-08-25 2009-12-31 Huebler Arved Navigation device
US8290642B2 (en) 2007-03-02 2012-10-16 The Boeing Company Electronic flight bag having filter system and method
US20080215193A1 (en) * 2007-03-02 2008-09-04 The Boeing Company Electronic flight bag having filter system and method
US8565923B2 (en) 2007-06-15 2013-10-22 Fujitsu Limited Robot
US20100094463A1 (en) * 2007-06-15 2010-04-15 Fujitsu Limited Robot
US9175964B2 (en) * 2007-06-28 2015-11-03 Apple Inc. Integrated calendar and map applications in a mobile device
US20090006994A1 (en) * 2007-06-28 2009-01-01 Scott Forstall Integrated calendar and map applications in a mobile device
US20140206397A1 (en) * 2007-07-12 2014-07-24 Yahoo! Inc. Mobile notification system
US8521342B2 (en) * 2007-11-29 2013-08-27 Airbus Operations Gmbh Electronic technical logbook
US20100268413A1 (en) * 2007-11-29 2010-10-21 Airbus Operations Gmbh Electronic technical logbook
US9207084B2 (en) 2008-06-27 2015-12-08 Apple Inc. Dynamic alerts for calendar events
US8346237B2 (en) 2008-09-18 2013-01-01 Apple Inc. Communications device having a commute time function and methods of use thereof
US20100168999A1 (en) * 2008-12-26 2010-07-01 Fujitsu Limited Computer readable medium for storing information display program, information display apparatus and information display method
US8423288B2 (en) 2009-11-30 2013-04-16 Apple Inc. Dynamic alerts for calendar events
US8660790B2 (en) 2009-11-30 2014-02-25 Apple Inc. Dynamic alerts for calendar events
US20110130958A1 (en) * 2009-11-30 2011-06-02 Apple Inc. Dynamic alerts for calendar events
US20120259539A1 (en) * 2009-12-28 2012-10-11 Clarion Co., Ltd. Navigation Device and Guiding Method Thereof
US20110179390A1 (en) * 2010-01-18 2011-07-21 Robert Paul Morris Methods, systems, and computer program products for traversing nodes in path on a display device
US9268474B2 (en) 2011-01-13 2016-02-23 Sony Corporation Information processing apparatus, method, and non-transitory computer-readable medium to control display of a map
US20120323482A1 (en) * 2011-06-16 2012-12-20 Mitac Research (Shanghai) Ltd. Program-storing computer-readable storage medium, computer program product, navigation device and control method thereof
US8712685B2 (en) 2012-01-26 2014-04-29 Fuji Xerox Co., Ltd. Information processing apparatus, non-transitory computer-readable recording medium, and information processing method
US10643134B2 (en) * 2012-08-16 2020-05-05 Samsung Electronics Co., Ltd. Schedule management method, schedule management server, and mobile terminal using the method
US11392992B2 (en) * 2012-11-30 2022-07-19 Panasonic Intellectual Property Corporation Of America Information providing method
US10378915B2 (en) 2013-03-28 2019-08-13 Microsoft Technology Licensing, Llc Navigating with a camera device
US20140330516A1 (en) * 2013-03-28 2014-11-06 LinkedIn Corporation Navigating with a camera device
US9797738B2 (en) 2013-03-28 2017-10-24 Linkedin Corporation Navigating with a camera device
US9389087B2 (en) * 2013-03-28 2016-07-12 Linkedin Corporation Navigating with a camera device
JP2017010284A (en) * 2015-06-22 2017-01-12 オカムラ印刷株式会社 Information processing system, information processing device, information processing program, portable terminal device, and its control program
JP2017010285A (en) * 2015-06-22 2017-01-12 オカムラ印刷株式会社 Information processing system, information processing program, information processing apparatus, and control program thereof
US20190145788A1 (en) * 2016-05-04 2019-05-16 Tomtom Navigation B.V. Methods and Systems for Determining Safe Return Range
US11802775B2 (en) * 2016-05-04 2023-10-31 Tomtom Navigation B.V. Methods and systems for determining safe return range
US20220121845A1 (en) * 2020-10-20 2022-04-21 Aero Record Technologies Inc. Systems and methods for dynamic digitization and extraction of aviation-related data
US11521408B2 (en) * 2020-10-20 2022-12-06 Aero Record Technologies Inc. Systems and methods for dynamic digitization and extraction of aviation-related data
CN114253261A (en) * 2021-12-08 2022-03-29 广州极飞科技股份有限公司 Path generation method, job control method and related device

Also Published As

Publication number Publication date
EP1569184A2 (en) 2005-08-31
EP1003017A3 (en) 2000-10-11
EP1628277A3 (en) 2008-03-26
EP1003017A2 (en) 2000-05-24
US20020099499A1 (en) 2002-07-25
EP1628277A2 (en) 2006-02-22
US6336072B1 (en) 2002-01-01
US6748316B2 (en) 2004-06-08
DE69936500T2 (en) 2008-03-06
DE69936500D1 (en) 2007-08-23
US20020103597A1 (en) 2002-08-01
JP2000215211A (en) 2000-08-04
EP1003017B1 (en) 2007-07-11
EP1569184A3 (en) 2014-05-07
JP3548459B2 (en) 2004-07-28
US6697731B2 (en) 2004-02-24

Similar Documents

Publication Publication Date Title
US6748316B2 (en) Apparatus and method for presenting navigation information based on instructions described in a script
US10860986B2 (en) Schedule management apparatus
US5948040A (en) Travel reservation information and planning system
US6381534B2 (en) Navigation information presenting apparatus and method thereof
US20090240429A1 (en) Method for route planning on a navigation system including points of interest
US20040225416A1 (en) Data creation apparatus
JP5038644B2 (en) Navigation system, route search server, terminal device, and advertisement display method
WO1998035311A9 (en) Travel reservation and information planning system
KR101724211B1 (en) System and method for providing smart travel services
US20040158389A1 (en) Information display system
US20050080554A1 (en) Area information provision system and method
JP6098302B2 (en) Navigation system, navigation method, and navigation program
JP2002054940A (en) Travel plan assisting system and information storing medium readable by computer
JP2000315293A (en) Automatic contact system, user terminal and server
JP2003228798A (en) Transmitter and method for transmitting moving information
JP4531776B2 (en) Navigation system, route search server, route search method, and terminal device
JP4297337B2 (en) GUIDANCE INFORMATION PROVIDING DEVICE, GUIDANCE INFORMATION PROVIDING METHOD, AND GUIDANCE INFORMATION DISPLAY METHOD
JP3884418B2 (en) Operation management apparatus using guidance script, operation management method using guidance script, and operation management program recording medium
JP2004246878A (en) Network utilizing entertainment system, market research method, and advertisement information presentation method
JP5531842B2 (en) Information providing system, information providing apparatus, information providing server, information providing method, and program
JP3943534B2 (en) Time adjustment device during movement using script for guidance, time adjustment method during movement using script for guidance, and recording medium for time adjustment program during movement
JP2005038447A (en) Moving information processor and moving information processing method
JP5109675B2 (en) Information providing apparatus, information providing method, and program
JP2006286018A (en) User terminal
JP4113909B1 (en) Route search system, route search server, and route search method

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKAYAMA, KUNIHARU;SEKIGUCHI, MINORU;NAITO, HIROHISA;AND OTHERS;REEL/FRAME:010243/0907

Effective date: 19990812

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12