US20100256981A1 - Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings - Google Patents


Info

Publication number
US20100256981A1
US20100256981A1 (application US12/753,687)
Authority
US
United States
Prior art keywords
user input
incident
image
geo
marked
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/753,687
Inventor
Steven Nielsen
Curtis Chambers
Jeffrey Farr
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CertusView Technologies LLC
Original Assignee
CertusView Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CertusView Technologies LLC
Priority to US12/753,687
Assigned to CERTUSVIEW TECHNOLOGIES, LLC. Assignment of assignors' interest (see document for details). Assignors: NIELSEN, STEVEN; CHAMBERS, CURTIS; FARR, JEFFREY
Publication of US20100256981A1
Current legal status: Abandoned


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • G07C5/085 Registering performance data using electronic data carriers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/302 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/304 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/50 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the display information being shared, e.g. external display, data transfer to other traffic participants or centralised traffic controller
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G07C5/008 Registering or indicating the working of vehicles communicating information to a remotely located station

Definitions

  • incidents that are not part of the standard business practice may take place and cause interruption to the business operation. Such incidents can potentially reduce the quality of the services or products of the business, and sometimes may impose civil or even criminal liabilities on the business.
  • the particular types of incidents that are disruptive may depend on the nature of the business. For example, in field service applications incidents to be reported may include personal injury events, vehicle accidents, and/or any types of property damage events that may occur in the field, and the like.
  • Reports may contain sensitive or confidential information that should be viewed only by authorized entities.
  • the necessary access control can be difficult to implement or enforce due to the lack of effective measures to prevent unauthorized access to the documents or other factors such as distribution errors.
  • Applicants have recognized that a need exists for improved ways of creating, distributing, and/or retrieving reports, such as, but not limited to, personal injury reports, vehicle accident reports, any types of damage reports, and the like.
  • various embodiments of the present invention are directed to methods, apparatus, and systems for documenting events via time-elapsed geo-referenced electronic drawings.
  • incidents such as property damage and personal injury
  • one or more drawings may be provided that are referenced to a geographic location and/or that in some way indicate (to scale) the actual environment in which incidents have occurred.
  • drawings may be provided to scale, include accurate directional and positional information, and/or include representations of various environmental landmarks (e.g., trees, buildings, poles, fire hydrants, barriers, any structures, etc).
  • Examples of reports that may include one or more geo-referenced electronic drawings according to various inventive embodiments disclosed herein include, but are not limited to, personal injury reports, vehicle accident reports, and any types of damage reports.
  • one embodiment described herein is directed to an apparatus for documenting an incident at an incident site.
  • the apparatus comprises a communication interface; a display device; at least one user input device; a memory to store processor-executable instructions; and a processing unit coupled to the communication interface, the display device, the at least one user input device, and the memory.
  • Upon execution of the processor-executable instructions by the processing unit, the processing unit: controls the communication interface to electronically receive source data representing at least one input image of a geographic area including the incident site; controls the display device to display at least a portion of the at least one input image; acquires user input from the at least one user input device to provide a representation of at least a portion of the incident on the displayed image; automatically acquires time and/or date information indicating a time and/or date that the user input was acquired; generates a marked-up digital image including the representation of at least a portion of the incident based on the user input; further controls the communication interface and/or the memory to electronically transmit and/or electronically store information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and further controls the communication interface and/or the memory to electronically transmit and/or electronically store the time and/or date information in association with the information relating to the marked-up digital image so as to document when the representation of the at least a portion of the incident was created.
  • Another embodiment is directed to a method for documenting an incident at an incident site.
  • the method comprises: A) electronically receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) adding to the at least a portion of the at least one input image, based on user input received via at least one user input device associated with the display device, a representation of at least a portion of the incident to thereby generate a marked-up digital image; D) automatically acquiring time and/or date information indicating a time and/or date that the user input was acquired; E) electronically transmitting and/or electronically storing information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and F) electronically transmitting and/or electronically storing the time and/or date information in association with the information relating to the marked-up digital image so as to document when the representation of the at least a portion of the incident was created.
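  • As a concrete, non-limiting illustration of steps A) through F), the following Python sketch shows one way such a flow could be organized in software. The class name IncidentDocumenter and the image_store/report_sink interfaces are assumptions for illustration only and are not part of the disclosure; display handling is omitted.

```python
import datetime
import json

class IncidentDocumenter:
    """Hypothetical sketch of steps A)-F): receive a geo-referenced input
    image, accept user mark-up, timestamp it, and store/transmit the result."""

    def __init__(self, image_store, report_sink):
        self.image_store = image_store  # supplies source data for input images (step A)
        self.report_sink = report_sink  # stands in for memory and/or a communication interface (steps E, F)

    def document_incident(self, location, user_markup):
        # A) electronically receive source data for the incident site
        source_data = self.image_store.fetch(location)

        # B) display at least a portion of the input image (display call omitted)
        displayed_image = source_data

        # C) add a representation of the incident based on user input
        marked_up = {"base_image": displayed_image, "markup": user_markup}

        # D) automatically acquire time/date information for the user input
        timestamp = datetime.datetime.now().isoformat()

        # E) and F) store/transmit the marked-up image together with the timestamp
        record = {"marked_up_image": marked_up, "created_at": timestamp}
        self.report_sink.store(json.dumps(record, default=str))
        return record
```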
  • a further embodiment is directed to at least one computer-readable medium encoded with instructions that, when executed on at least one processing unit, perform a method for documenting an incident at an incident site.
  • the method comprises: A) electronically receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving user input via at least one user input device associated with the display device; D) automatically acquiring time and/or date information indicating a time and/or date that the user input was acquired; E) adding, based on the user input, a representation of at least a portion of the incident to the displayed at least one input image to thereby generate a marked-up digital image; F) electronically transmitting and/or electronically storing information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and G) electronically transmitting and/or electronically storing the time and/or date information in association with the information relating to the marked-up digital image so as to document when the representation of the at least a portion of the incident was created.
  • the apparatus comprises a communication interface; a display device; at least one user input device; a memory to store processor-executable instructions; and a processing unit coupled to the communication interface, the display device, the at least one user input device, and the memory.
  • Upon execution of the processor-executable instructions by the processing unit, the processing unit: controls the communication interface to electronically receive source data representing at least one input image of a geographic area including the incident site; controls the display device to display at least a portion of the at least one input image; acquires first user input from the at least one user input device to provide a first representation of at least a portion of the incident at a first time on the at least one input image; generates a first marked-up digital image including the first representation based on the first user input; acquires second user input from the at least one user input device to provide a second representation of at least a portion of the incident at a second time on the at least one input image; generates a second marked-up digital image including the second representation based on the second user input; and further controls the communication interface and/or the memory to electronically transmit and/or electronically store information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
  • a further embodiment is directed to a method for documenting an incident at an incident site.
  • the method comprises: A) receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving first user input via at least one user input device associated with the display device; D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image; E) receiving second user input via the at least one user input device; F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
  • Another embodiment is directed to at least one computer-readable medium encoded with instructions that, when executed on at least one processing unit, perform a method for documenting an incident at an incident site.
  • the method comprises: A) receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving first user input via at least one user input device associated with the display device; D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image; E) receiving second user input via the at least one user input device; F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
  • FIG. 1 illustrates a functional block diagram of a geo-referenced electronic drawing application for documenting and reporting events, according to the present disclosure
  • FIG. 2 illustrates an example of a drawing tool GUI of the geo-referenced electronic drawing application, according to the present disclosure
  • FIG. 3 illustrates an example of a series of geo-referenced drawings that are generated using the geo-referenced electronic drawing application, according to the present disclosure
  • FIG. 4 illustrates an example of a report that is generated using the geo-referenced electronic drawing application and that includes a geo-referenced drawing, according to the present disclosure
  • FIG. 5 illustrates a flow diagram of an example of a method of operation of the geo-referenced electronic drawing application, according to the present disclosure
  • FIG. 6 illustrates a functional block diagram of a networked system that includes the geo-referenced electronic drawing application for documenting and reporting events, according to the present disclosure
  • FIG. 7 shows a map, representing an exemplary input image
  • FIG. 8 shows a construction/engineering drawing, representing an exemplary input image
  • FIG. 9 shows a land survey map, representing an exemplary input image
  • FIG. 10 shows a grid, overlaid on the construction/engineering drawing of FIG. 8 , representing an exemplary input image
  • FIG. 11 shows a street level image, representing an exemplary input image
  • FIG. 12 shows the drawing tool GUI of FIG. 2 displaying a layer directory pane that facilitates the manipulation of layers
  • FIG. 13 shows the drawing tool GUI of FIG. 2 displaying an animation controls window that facilitates generation of an animated sequence
  • FIG. 14 shows an illustrative computer that may be used at least in part to implement the geo-referenced electronic drawing application in accordance with some embodiments.
  • FIG. 15 shows an example of an input image constructed from bare data.
  • a geo-referenced electronic drawing application for documenting and reporting events is described herein.
  • the geo-referenced electronic drawing application may provide a mechanism for importing a geo-referenced image that may be marked up with symbols and/or any other markings for indicating the details of an event, such as a vehicle accident.
  • the geo-referenced image may include data associated therewith (e.g., embedded metadata) that allows identification of locational information (e.g., locational coordinates) for any point or region on the image.
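  • A common way to carry such locational metadata is an affine geo-transform that maps pixel coordinates to longitude/latitude, as used by many GIS raster formats. The sketch below is a minimal illustration under that assumption; the transform values are hypothetical and this particular convention is not prescribed by the disclosure.

```python
def pixel_to_geo(px, py, geo_transform):
    """Map a pixel position to (longitude, latitude) using a 6-term affine
    geo-transform: (origin_x, pixel_width, row_rotation,
                    origin_y, column_rotation, pixel_height)."""
    ox, pw, rr, oy, cr, ph = geo_transform
    lon = ox + px * pw + py * rr
    lat = oy + px * cr + py * ph
    return lon, lat

# Example: upper-left corner at (-119.81, 39.53), roughly 1e-5 degrees per
# pixel, no rotation (all values hypothetical).
geo_transform = (-119.81, 1e-5, 0.0, 39.53, 0.0, -1e-5)
print(pixel_to_geo(250, 400, geo_transform))
```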
  • a networked system that includes the geo-referenced electronic drawing application is also described.
  • geo-referenced electronic drawing application described herein is suitable for generating any type of report in which a geo-referenced image (or other image) may be useful, such as, but not limited to, personal injury reports, vehicle accident reports, any types of property damage reports, and the like.
  • the methods and apparatus described herein may be useful for providing reports that include images in various field service applications, such as, but not limited to, those of underground facilities locate companies, excavation companies, landscaping companies, tree care and removal companies, utility installation and repair companies, and the like.
  • the geo-referenced electronic drawing application described herein may provide the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines in order to provide improved and consistent accuracy with respect to drawings that support incident reports.
  • geo-referenced electronic drawing application described herein may provide the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines to scale, again providing improved and consistent accuracy with respect to drawings that support incident reports.
  • geo-referenced electronic drawing application may provide a standard symbols library, thereby providing standardization with respect to drawings that support incident reports.
  • Networked systems that include the geo-referenced electronic drawing application described herein may provide improved distribution, tracking, and auditing of reports among entities and the systems provide improved control over access to reports.
  • Geo-referenced electronic drawing application 100 may be a standalone and/or a web-based software application that allows a user to import a geo-referenced image and then mark up the image with symbols and/or any other markings for indicating the details of an event, such as a vehicle accident.
  • Geo-referenced electronic drawing application 100 may be executed by a processing unit 110 and stored in memory 112 .
  • Processing unit 110 may be any standard microcontroller or microprocessor device that is capable of executing program instructions of geo-referenced electronic drawing application 100 .
  • Memory 112 may be any standard data storage medium.
  • a symbols library 114, a collection of input images 116, certain geo-location data 118, and timestamp data 120 may be stored in memory 112.
  • Timestamp data 120 may include calendar date and/or time information. Timestamp data 120 may originate from, for example, the computing device on which geo-referenced electronic drawing application 100 is installed, any other computing device, and/or manual entry by the user.
  • Location stamp data 150 may include location information such as a city and state, zip code, or geographic coordinates. Location stamp data 150 may originate from, for example, the computing device on which geo-referenced electronic drawing application 100 is installed, any other computing device, and/or manual entry by the user.
  • Symbols library 114 , input images 116 , geo-location data 118 , timestamp data 120 and location stamp data 150 support the functions of a drawing tool graphical user interface (GUI) 122 of geo-referenced electronic drawing application 100 .
  • Drawing tool GUI 122 is suitable for presenting on the display of any computing device, such as a computing device 140 .
  • processing unit 110 retrieves a certain input image 116 that corresponds to the geographic location information and displays the input image 116 in a window of drawing tool GUI 122 .
  • Geographic location information may be, for example, a physical address, latitude and longitude coordinates, and/or any global positioning system (GPS) data.
  • an input image 116 is any image represented by source data that is electronically processed (e.g., the source data is in a computer-readable format) to display the image on a display device.
  • An input image 116 may include any of a variety of paper/tangible image sources that are scanned (e.g., via an electronic scanner) or otherwise converted so as to create source data (e.g., in various formats such as XML, PDF, JPG, BMP, etc.) that can be processed to display the input image 116 .
  • An input image 116 also may include an image that originates as source data or an electronic file without necessarily having a corresponding paper/tangible copy of the image (e.g., an image of a “real-world” scene acquired by a digital still frame or video camera or other image acquisition device, in which the source data, at least in part, represents pixel information from the image acquisition device).
  • input images 116 may be created, provided, and/or processed by a geographic information system (GIS) that captures, stores, analyzes, manages and presents data referring to (or linked to) location, such that the source data representing the input image 116 includes pixel information from an image acquisition device (corresponding to an acquired “real world” scene or representation thereof), and/or spatial/geographic information (“geo-encoded information”).
  • one or more input images 116 may be stored in local memory 112 of the computing device 140 and/or retrieved from the optional remote computer (e.g., via the communication interface 124 ) and then stored in local memory.
  • Various information may be derived from the one or more input images for display (e.g., all or a portion of the input image, metadata associated with the input image, etc.).
  • FIG. 15 shows an example of an input image 1500 constructed from bare data.
  • input image 1500 includes a representation of a street sign 1510 , representations of traffic conditions 1512 and 1514 , and a representation of a weather condition 1516 .
  • the location of the street sign representation 1510 and traffic condition representations 1512 and 1514 may correspond to the actual locations of the street signs and traffic conditions in the region shown in the input image 1500 .
  • the location of the representation of the weather condition 1516 may be arbitrarily selected, or selected to be in a corner of the input image 1500 , as the representation may indicate that the weather condition corresponds generally to the entire region shown in the input image 1500 .
  • the type of street sign 1510 is a stop sign
  • the traffic conditions 1512 and 1514 are “construction” and “light traffic”
  • the weather condition 1516 is lightning;
  • source data representing an input image 116 may be compiled from multiple data/information sources; for example, any two or more of the examples provided above for input images and source data representing input images 116 , or any two or more other data sources, can provide information that can be combined or integrated to form source data that is electronically processed to display an image on a display device.
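  • An input image constructed from bare data (as in FIG. 15) can be thought of as rendering location-tagged records onto a blank canvas. The sketch below is illustrative only; the record format, the geo_to_pixel callable, and the use of the Pillow library are assumptions, not elements of the disclosed application.

```python
from PIL import Image, ImageDraw  # Pillow, used here only for illustration

def render_bare_data(records, geo_to_pixel, size=(800, 600)):
    """Compose an input image from bare, location-tagged records (e.g., street
    signs, traffic conditions, weather) rather than from camera pixels."""
    canvas = Image.new("RGB", size, "white")
    draw = ImageDraw.Draw(canvas)
    for rec in records:
        if rec["kind"] == "weather":
            # a region-wide condition can be pinned to a corner of the image
            draw.text((10, 10), rec["label"], fill="black")
        else:
            x, y = geo_to_pixel(rec["lon"], rec["lat"])
            draw.text((x, y), rec["label"], fill="black")
    return canvas
```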
  • An example of drawing tool GUI 122 of geo-referenced electronic drawing application 100 is presented in FIG. 2. Drawing tool GUI 122 may be implemented, for example, by a web browser that is presented via any networked computing device, such as computing device 140 of FIG. 1, or by a GUI window that is presented via any computing device.
  • Drawing tool GUI 122 may present a certain input image 116 that corresponds to specified geographic location information. For example, location information from geo-location data 118 may be automatically read into an address field 210 and/or a geo-location data field 212 . Alternatively, location information may be manually entered in address field 210 and/or geo-location data field 212 . In one example, input image 116 may be an aerial image that corresponds to the geographic location information. Overlaying input image 116 may be an image scale 214 . Input image 116 is read into drawing tool GUI 122 and may be oriented in the proper manner with respect to directional heading (i.e., north, south, east, and west).
  • Drawing tool GUI 122 may also include various palettes, toolbars, or other interfaces that enable the user to manipulate (e.g., zoom in, zoom out) and/or mark up input image 116 .
  • drawing tool GUI 122 may include a drawing toolbar 216 that may include a sketching palette as well as a symbols palette.
  • the sketching palette portion of drawing toolbar 216 may provide standard drawing tools that allow a user to draw certain shapes (e.g., a polygon, a rectangle, a circle, a line) atop input image 116 .
  • the symbols palette portion of drawing toolbar 216 provides a collection of any symbols that may be useful for depicting the event of interest, such as a vehicle accident.
  • the source of these symbols may be symbols library 114 .
  • symbols library 114 may include, but is not limited to, a collection of car symbols, truck symbols, other vehicle symbols (e.g., emergency vehicles, buses, farm equipment, 2-wheel vehicles, etc), landmark symbols (e.g., fire hydrants, trees, fences, poles, cross walks, various barriers, etc), symbols of signs (e.g., standard road signs, any other signs, etc), symbols of people (e.g., pedestrians), symbols of animals, and the like.
  • a user may mark up input image 116 in a manner that depicts, for example, the vehicle accident scene.
  • a vehicle collision is depicted by a vehicle #1 and a vehicle #2 overlaid on input image 116.
  • the symbols for vehicle #1 and vehicle #2 are selected from the symbols palette portion of drawing toolbar 216.
  • the drawing tool GUI 122 may allow a user to specify a confidence level for a selected symbol. For example, if a user selects a symbol corresponding to a bus to be overlaid on input image 116 , the user may specify an associated confidence level to indicate a degree of confidence that the observed vehicle was a bus.
  • the confidence level may be numeric, e.g., “25%,” or descriptive, e.g., “low.”
  • An indication of the confidence level or a degree of uncertainty may be displayed adjacent the corresponding symbol or may be integrated with the symbol itself. For example, a question mark or the confidence level may be displayed on or near the symbol. Additionally or alternatively, an indication of the confidence level may be included in the text of a vehicle accident report including the marked up input image.
  • the aforementioned palettes, toolbars, and/or symbols library are described in the context of preparing a vehicle accident report. However, this is exemplary only.
  • the palettes, toolbars, and/or symbols library of the geo-referenced electronic drawing application of the present disclosure may be industry-specific and/or incident type-specific. As a result, the palettes, toolbars, and/or symbols library may be selectable by the user depending on the application in which the geo-referenced electronic drawing application is being used. In one example, with respect to an incident involving tree damage and/or a tree damaging a structure, the user may select palettes, toolbars, and/or symbols that include trees and building rooflines that may be used for marking up the geo-referenced image.
  • geo-referenced electronic drawing application 100 may be designed to automatically render symbols to scale upon the geo-referenced drawing according to the settings of scale 214 . This is one example of how geo-referenced electronic drawing application 100 may provide consistent accuracy to drawings that support incident reports. Further, the presence of a standard symbols library, such as symbols library 114 , is one example of how geo-referenced electronic drawing application 100 provides standardization to drawings that support incident reports.
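  • Rendering a symbol to scale amounts to converting its real-world footprint into pixels using the image scale. A minimal sketch, assuming the scale is expressed in meters per pixel (the dimensions and resolution below are hypothetical):

```python
def symbol_size_in_pixels(real_width_m, real_length_m, meters_per_pixel):
    """Convert a symbol's real-world dimensions (e.g., a 1.8 m x 4.5 m car)
    into pixel dimensions so it is drawn to scale on the geo-referenced image."""
    return (round(real_width_m / meters_per_pixel),
            round(real_length_m / meters_per_pixel))

# Example: a passenger car on an image with 0.15 m/pixel resolution
print(symbol_size_in_pixels(1.8, 4.5, 0.15))   # -> (12, 30)
```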
  • the geo-referenced electronic drawing application 100 may be configured to allow the viewing angle or perspective of the input image 116 and/or representations thereon to be changed. For example, a user may switch between an overhead view, a perspective view, and a side view. This may be accomplished by correlating corresponding points in two or more geo-referenced images, for example.
  • a symbol such as a representation of a vehicle, or other content-related marking added to an image may have three-dimensional data associated therewith to enable the symbol to be viewed from different angles.
  • the geo-referenced electronic drawing application 100 may be configured to allow the input image 116 to be manually or automatically modified. For example, it may be desirable to remove extraneous features, such as cars, from the input image 116 .
  • the geo-referenced electronic drawing application 100 may include shape or object recognition software that allows such features to be identified and/or removed.
  • One example of software capable of recognizing features in an image, such as an aerial image, is ENVI® image processing and analysis software by ITT Corporation of White Plains, N.Y. Exemplary features that may be recognized include vehicles, buildings, roads, bridges, rivers, lakes, and fields.
  • the geo-referenced electronic drawing application 100 may be configured such that a value indicating a level of confidence that an identified object corresponds to a particular feature may optionally be displayed.
  • Automatically identified features may be automatically modified in the image in some manner.
  • the features may be blurred or colored (e.g., white, black or to resemble a color of one or more pixels adjacent the feature).
  • the geo-referenced electronic drawing application 100 may include drawing tools (e.g., an eraser tool or copy and paste tool), that allow such features to be removed, concealed, or otherwise modified after being visually recognized by a user or automatically recognized by the geo-referenced electronic drawing application 100 or associated software.
  • Drawing toolbar 216 may also allow the user to add text boxes that can be used to add textual content to input image 116 .
  • a callout 218 may be one mechanism for entering and displaying textual information about, in this example, the vehicle collision.
  • drawing tool GUI 122 may include a navigation toolbar 220 by which the user may zoom or pan input image 116 (e.g., zoom in, zoom out, zoom to, pan, pan left, pan right, pan up, pan down, etc.).
  • Navigation toolbar 220 may additionally include one or more buttons that enable user drawn shapes to be accentuated (e.g., grayscale, transparency, etc.).
  • a set of scroll controls 222 may be provided in the image display window that allows the user to scroll input image 116 north, south, east, west, and so on with respect to real world directional heading.
  • the drawing application may be configured to reposition the displayed image so that it is directionally aligned with a direction of the display screen, based on an input from a compass or other device indicative of an orientation of the display screen in the environment.
  • Overlaying input image 116 may also be a timestamp 224 and/or a location stamp 250 .
  • Timestamp 224 may indicate the creation date and/or time or a save date and/or time of a marked up input image 116 or information used to generate the marked up input image.
  • Timestamp data 120 in memory 112 of FIG. 1 may be the source of information of timestamp 224 . Such data may be based on an output of a local or remote timer, for example.
  • Location stamp 250 may indicate the location where the marked up input image 116 or information used to generate the marked up input image was saved.
  • Location stamp data 150 in memory 112 of FIG. 1 may be the source of information for location stamp 250.
  • Such data may be based on the output of a GPS device, for example.
  • the timestamp 224 and location stamp 250 may be automatically generated based, for example, on the output of a timer device and GPS device as discussed above. Further, the timestamp and location stamp may be difficult or impossible for a user to modify. Thus, the timestamp and location stamp may be used to verify that the marked-up input image with which they are associated was created at an expected time and place, such as the general or specific time and place where the vehicular accident or other incident was investigated. If desired, time and/or location data may be automatically acquired several times during the creation of one or more marked-up digital images, and may be stored in association with the images, to enable verification that the user was present at the time and/or place of the investigation for some duration of time.
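  • Acquiring time and location data several times during mark-up, and storing it with the image, can be sketched as stamping each editing event from a clock and a GPS source. The gps_reader callable and the event dictionaries below are assumptions for illustration.

```python
import datetime

def acquire_stamp(gps_reader):
    """Capture one (time, location) stamp; `gps_reader` is any callable
    returning (latitude, longitude)."""
    return {
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "location": gps_reader(),
    }

def stamp_session(gps_reader, events):
    """Attach a stamp to each editing event (e.g., each save of a marked-up
    image) so presence at the incident site can later be verified."""
    return [dict(event, stamp=acquire_stamp(gps_reader)) for event in events]
```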
  • geo-referenced electronic drawing application 100 may provide improved and consistent accuracy to drawings that support incident reports.
  • the input image data and the mark up data may be displayed as separate “layers” of the visual rendering, such that a viewer of the visual rendering may turn on and turn off displayed data based on a categorization of the displayed data.
  • Respective layers may be enabled or disabled for display in any of a variety of manners.
  • a “layer directory” or “layer legend” pane 1200 may be rendered in the viewing window of drawing tool GUI 122 described in connection with FIG. 2 .
  • the layer directory pane 1200 may show all available layers, and allow a viewer to select each available layer to be either displayed or hidden, thus facilitating comparative viewing of layers.
  • the layer directory pane 1200 may be displayed by selecting a “display layer directory pane” action item in the layers menu 1202 .
  • image information is categorized generally under layer designation 1202 (“reference layer”), and mark-up information available to be overlaid on the input image is categorized generally under a “symbols layer” designation; each layer may be independently enabled or disabled for display (e.g., hidden) by selecting the corresponding check box.
  • the reference layer and symbols layers may have sub-categories for sub-layers, such that each sub-layer may also be selectively enabled or disabled for viewing by a viewer.
  • a “base image” sub-layer may be selected for display.
  • the base image sub-layer is merely one example of a sub-layer that may be included under the “reference layer,” as other sub-layers (e.g., “grid”) are possible.
  • symbol types that may be overlaid on the input image may be categorized under different sub-layer designations (e.g., designation 1208 for “cars layer;” designation 1212 for “trucks layer;” designation 1216 for “other vehicles layer;” designation 1218 for “landmarks layer;” and designation 1220 for “signs layer”).
  • the various sub-layers may have further sub-categories for sub-layers, such that particular features within a sub-layer may also be selectively enabled or disabled for viewing by a viewer.
  • the cars layer may include a designation 1210 for “car 1”
  • the truck layer may include a designation 1214 for “truck 1.”
  • information concerning the car 1222 (“car 1”) and truck 1224 (“truck 1”) involved in the accident can be selected for display.
  • both the reference and symbols layers are enabled for display.
  • the base image layer is enabled for display.
  • the symbols layer sub-layers only the cars layer and the trucks layer are enabled for display.
  • the further sub-layers “car 1” and “truck 1” are enabled for display. Accordingly, a base image is rendered in the viewing window of drawing tool GUI 122, and only car 1222 and truck 1224 are rendered thereon.
  • any characteristic of the information available for display may serve to categorize the information for purposes of display layers or sub-layers.
  • any of the various exemplary elements that may be rendered using the drawing tool GUI 122 discussed herein e.g., timestamps; scales; callouts; estimated time information; input image content; symbols relating to vehicles, landmarks, signs, people, animals or the like, etc.
  • layers may be based on user-defined attributes of symbols or other rendered features. For example, a layer may be based on the speed of vehicles, whether vehicles were involved in the accident, whether the vehicles are public service vehicles, the location of vehicles at a particular time, and so on. For example, a user may define particular vehicle symbols as having corresponding speeds, and a “moving vehicles layer” may be selected to enable the display of vehicles having non-zero speeds. Additionally or alternatively, selecting the moving vehicles layer may cause information concerning the speed of the moving vehicles to be displayed. For example, text indicating a speed of 15 mph may be displayed adjacent a corresponding vehicle.
  • a user may define particular vehicle symbols as being involved in the accident, and an “accident vehicles layer” may be selected to enable the display of vehicles involved in the accident. Additionally or alternatively, selecting the accident vehicles layer may cause information identifying accident vehicles to be displayed. For example, an icon indicative of an accident vehicle may be displayed adjacent a corresponding vehicle.
  • the “moving vehicles layer” and the “accident vehicles” layer may be sub-layers under the symbols layer, or may be sub-layers under a “vehicle layer” (not shown), which itself is a sub-layer under the symbols layer. Further, the “moving vehicles layer” and the “accident vehicles layer” may in turn include sub-layers.
  • the “moving vehicles layer” may include a sub-layer to enable the display of all vehicles traveling east.
  • the user-determined and/or automatically determined confidence levels of respective symbols may be used as the basis for defining layers.
  • a layer may be defined to include only those symbols that have an associated user-determined and/or automatically determined confidence level of at least some percentage, e.g., 50%.
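  • Defining a display layer by confidence level is essentially a filter over the symbols' attributes. A small sketch, assuming each symbol record carries a numeric confidence field (the records shown are hypothetical):

```python
def confidence_layer(symbols, threshold=0.5):
    """Return only the symbols whose user- or automatically-determined
    confidence level meets the threshold, for display as one layer."""
    return [s for s in symbols if s.get("confidence", 0.0) >= threshold]

symbols = [
    {"name": "bus", "confidence": 0.25},
    {"name": "car 1", "confidence": 0.9},
    {"name": "truck 1", "confidence": 0.6},
]
print([s["name"] for s in confidence_layer(symbols)])   # -> ['car 1', 'truck 1']
```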
  • the information concerning the confidence levels associated with the symbols may be drawn from a report in which such levels are included.
  • the attributes and/or type of visual information displayed as a result of selecting one or more layers or sub-layers is not limited.
  • visual information corresponding to a selected layer or sub-layer may be electronically rendered in the form of one or more lines or shapes (of various colors, shadings and/or line types), text, graphics (e.g., symbols or icons), and/or images, for example.
  • the visual information corresponding to a selected layer or sub-layer may include multiple forms of visual information (one or more of lines, shapes, text, graphics and/or images).
  • all of the symbols and/or other overlaid information of a particular marked up input image may be categorized as a display layer, such that the overlaid information may be selectively enabled or disabled for display as a display layer.
  • a user may conveniently toggle between the display of various related marked up input images (e.g., marked up input images relating to the same accident or other event) for comparative display.
  • a user may toggle between scenes depicting the events of an accident at different times.
  • a layer need not include a singular category of symbols or overlaid information, and may be customized according to a user's preferences. For example, a user may select particular features in one or more marked up input images that the user would like to enable to be displayed collectively as a layer. Additionally or alternatively, the user may select a plurality of categories of features that the user would like to enable to be displayed collectively as a layer.
  • processing unit 110 may automatically select which layers are displayed or hidden. As an example, if a user depicts a truck in the accident scene using a truck symbol, processing unit 110 may automatically select the “truck layer” sub-layer and the “truck 1 ” sub-sub layer for display in the display field. As another example, if a user specifies or selects landmarks to be displayed, processing unit 110 may automatically select the base image to be hidden to provide an uncluttered depiction of an accident scene.
  • the foregoing are merely illustrative examples of automatic selection/enabling of layers, and the inventive concepts discussed herein are not limited in these respects.
  • the marked up input image 116 may be saved as an event-specific image 126 .
  • any event-specific images 126 created therein may be converted to any standard digital image file format, such as PDF, JPG, or BMP, and saved, for example, in memory 112 or to an associated file system (not shown).
  • the multiple event-specific images 126 may be associated with one another via, for example, respective descriptor files 128 and saved as an image series 130. An example of an image series 130 is shown with reference to FIG. 3.
  • Each descriptor file 128 includes information about each event-specific image 126 of an image series 130 .
  • each descriptor file 128 may include the accident report number, the name of the event-specific image 126 with respect to the image series 130 , the creation date, and the like.
  • Descriptor files 128 provide a mechanism of geo-referenced electronic drawing application 100 that allow event-specific images 126 and/or any image series 130 to be queried by other applications, such as any incident management applications.
  • descriptor files 128 may be extensible markup language (XML) files that are created during the save process of event-specific images 126 and/or image series 130 .
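  • A descriptor file of this kind can be produced with a few lines of standard-library XML code. The element names and values below are hypothetical; the disclosure only states that descriptor files may be XML and may carry items such as the accident report number, the image name within the series, and the creation date.

```python
import xml.etree.ElementTree as ET

def write_descriptor(path, report_number, image_name, series_name, created):
    """Write a minimal XML descriptor so other applications (e.g., an incident
    management application) can query the event-specific image and its series."""
    root = ET.Element("descriptor")
    ET.SubElement(root, "report_number").text = report_number
    ET.SubElement(root, "image_name").text = image_name
    ET.SubElement(root, "image_series").text = series_name
    ET.SubElement(root, "created").text = created
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

write_descriptor("frame1.xml", "report-0001", "126A", "image-series-130",
                 "2010-04-02T14:05:00")
```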
  • FIG. 3 shows an example of an image series 130 that depicts time-lapsed sequential images of a vehicle collision (i.e., essentially representing time-lapsed frames 1, 2, and 3 in sequence).
  • frame 1 is represented by an event-specific image 126A that depicts vehicle #1 heading westbound and vehicle #2 heading eastbound, just prior to the collision.
  • Frame 2 is represented by an event-specific image 126B that depicts vehicle #1 and vehicle #2 at the moment of impact during the collision.
  • Frame 3 is represented by an event-specific image 126C that depicts the final resting place of vehicle #1 and vehicle #2 after the collision.
  • Each of the event-specific images 126A-C may include a corresponding estimated relative time 225A-C represented thereon.
  • the estimated relative time may reflect an estimated time of the event (e.g., a vehicle accident) depicted in the event-specific image.
  • an estimated relative time is rendered visually (e.g., overlaid) on the input image 116 of each of the event-specific images 126A-C.
  • the estimated relative time 225A is represented by a variable “t,” which corresponds to an unknown time.
  • the estimated relative times 225B and 225C are represented by times relative to the variable “t” (i.e., “t+0.5 sec” and “t+1 sec,” respectively). Additionally or alternatively, the estimated relative time may reflect an estimated date of the vehicle accident. As also shown in FIG. 3, one or more of event-specific images 126A-C may include a corresponding estimated actual time 227 represented thereon. The estimated actual time may reflect an estimated non-relative time of the vehicle accident. The estimated relative time 225A-C and the estimated actual time 227 may be estimated by the user of the drawing application or a related party.
  • an animation controls window 1302 may be rendered in the viewing window of drawing tool GUI 122 described in connection with FIG. 2 to facilitate generation of an animated sequence.
  • the animation controls window 1302 may be displayed by selecting a “display animation controls” action item in the animation menu 1300 .
  • the animation controls window 1302 comprises an interface 1304 for specifying frame order, an interface 1306 for specifying animation speed, and an interface 1308 for specifying a transition between frames.
  • interface 1304 lists each of the frames representing event-specific images.
  • a user may specify a sequential order for the listed frames by selecting up or down arrows associated with the listed frames. For example, a user may select the down arrow associated with “Frame 1 ” to move this frame to a later sequential order.
  • Interface 1306 lists options for specifying the animation speed of the frames.
  • a first option, which is selected in the example of FIG. 13, provides that the animation speed of the frames will be based on an estimated time for each frame.
  • the animation speed may be based on an estimated relative time or an estimated actual time that may be specified for each frame as discussed in connection with FIG. 3. For example, if Frame 2 has an estimated relative time that is two seconds after that of Frame 1, and Frame 3 has an estimated relative time that is five seconds after that of Frame 2, the frames may be displayed at zero seconds, two seconds, and seven seconds, respectively, or at some multiplier thereof.
  • the frames may be displayed at half of the estimated actual speed by displaying the frames at zero seconds, four seconds, and fourteen seconds, respectively.
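  • Deriving a playback schedule from per-frame estimated times and a speed multiplier reduces to scaling each frame's offset from the first frame. A minimal sketch reproducing the half-speed example above (function and parameter names are illustrative):

```python
def display_schedule(estimated_times_s, speed=1.0):
    """Convert per-frame estimated times into playback offsets.
    speed=1.0 plays at the estimated rate; speed=0.5 plays at half speed."""
    t0 = estimated_times_s[0]
    return [(t - t0) / speed for t in estimated_times_s]

# Frames estimated at t, t+2 s, and t+7 s play back at 0, 4, and 14 s at half speed.
print(display_schedule([0.0, 2.0, 7.0], speed=0.5))   # -> [0.0, 4.0, 14.0]
```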
  • a second option, which is unselected in the example of FIG. 13, provides that the animation speed of the frames will be based on a regular interval, the length of which may be adjusted by sliding the arrow associated with the interval length control feature to the left or the right.
  • the animation speed need not be consistent for all frames, and that the animation speed for particular sequences of frames may be adjusted as desired by the user.
  • the time associated with one or more frames may be increased or decreased relative to an estimated time so that the user can observe how such an increase or decrease impacts the animation and/or simulate different scenarios.
  • Interface 1308 lists options for specifying a transition between overlaid features in the frames (e.g., vehicle symbols).
  • a first option, which is selected in the example of FIG. 13, provides that there is no transition between the overlaid features. In this case, the frames will simply be displayed sequentially as a time-lapse animation.
  • a second option, which is unselected in the example of FIG. 13, provides that the overlaid features (e.g., vehicle symbols) will trace a path between their positions in consecutive frames to transition between the event-specific images of consecutive frames.
  • the path may be a linear path between a first point representing a center of the vehicle symbol in one image and a second point representing a center of the vehicle symbol in a consecutive image.
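  • Tracing such a path is straightforward linear interpolation between the symbol's center points in consecutive frames; the sketch below is illustrative only and the coordinates are hypothetical.

```python
def interpolate_center(start, end, fraction):
    """Linearly interpolate a symbol's center between its position in one
    frame (`start`) and the next frame (`end`), with 0 <= fraction <= 1."""
    return (start[0] + (end[0] - start[0]) * fraction,
            start[1] + (end[1] - start[1]) * fraction)

# A vehicle symbol moving from pixel (120, 340) to (180, 300) across a transition
for f in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(interpolate_center((120, 340), (180, 300), f))
```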
  • all overlaid features or all overlaid features of a particular type will, by default, be animated in this manner.
  • an additional interface may be displayed that allows a user to select which overlaid features are to be animated and which are to remain stationary. For example, a user may specify that only features belonging to a certain custom or non-custom layer be animated while all other features remain stationary. Conversely, a user may specify that only features belonging to a certain custom or non-custom layer shall remain stationary while all other features are animated.
  • This additional interface may or may not be used in connection with default settings.
  • geo-referenced electronic drawing application 100 provides a mechanism by which event-specific images 126 and/or any image series 130 or animation based thereon may be integrated into electronic reports, such as reports 132 of FIG. 1 .
  • Reports 132 may be any electronic reports in which geo-referenced electronic drawings may be useful, such as electronic personal injury reports, electronic vehicle accident reports, any types of electronic property damage reports, and the like.
  • An example of a report 132 is shown with reference to FIG. 4 .
  • a traffic collision report 400 that is generated using geo-referenced electronic drawing application 100 and that includes a geo-referenced drawing is presented in FIG. 4.
  • Traffic collision report 400 is an example of a report 132 .
  • Traffic collision report 400 may be, for example, a report used by accident investigation companies, law enforcement agencies, and/or insurance companies.
  • a certain event-specific image 126 is read into a drawing field of traffic collision report 400 .
  • the textual information of traffic collision report 400 may be manually entered and/or automatically imported from information associated with event-specific image 126 , which was captured using drawing tool GUI 122 .
  • a “Description of Accident” field may be populated with textual information of callout 218 (see FIG. 2 ).
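  • Populating report text fields from information captured with the drawing tool is a simple mapping from image metadata to report fields. The field and key names below are hypothetical, chosen only to mirror the "Description of Accident" example above.

```python
def populate_report(report_fields, image_metadata):
    """Fill report text fields from data captured with the drawing tool,
    e.g., copy the callout text into 'Description of Accident'."""
    report = dict(report_fields)
    report["Description of Accident"] = image_metadata.get("callout", "")
    report["Date/Time"] = image_metadata.get("timestamp", "")
    report["Location"] = image_metadata.get("location_stamp", "")
    return report

metadata = {"callout": "Vehicle #1 struck vehicle #2 while turning left.",
            "timestamp": "2010-04-02T14:05:00",
            "location_stamp": "Reno, NV"}
print(populate_report({}, metadata))
```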
  • an entry screen (not shown) of geo-referenced electronic drawing application 100 may be provided that allows the user to manually enter and/or modify information in the text fields of a report 132 , such as the text fields of traffic collision report 400 .
  • the entry screen may be incorporated in and/or operate in combination with drawing tool GUI 122 .
  • a report 132 such as traffic collision report 400 , is not limited to incorporating a single event-specific image 126 only.
  • subsequent pages of traffic collision report 400 may include all event-specific images 126 of a certain image series 130 , such as those shown in FIG. 3 .
  • Method 500 may include, but is not limited to, the following steps, which are not limited to any order.
  • processing unit 110 of geo-referenced electronic drawing application 100 acquires location information with respect to the event of interest.
  • location information from geo-location data 118 may be automatically read into address field 210 and/or geo-location data field 212 of drawing tool GUI 122 .
  • location information may be manually entered in address field 210 and/or geo-location data field 212 .
  • at step 512, the collection of geo-referenced images is queried, the matching geo-referenced image is read into drawing tool GUI 122, and the geo-referenced image is rendered in the viewing window of drawing tool GUI 122.
  • processing unit 110 of geo-referenced electronic drawing application 100 queries input images 116 , which are the geo-referenced images, in order to find the input image 116 that matches the location information of step 510 .
  • the input image 116 is read into drawing tool GUI 122 and rendered in the viewing window thereof.
  • a geo-referenced image is provided to the user, upon which markings that indicate the event of interest may be made.
  • an input image 116 that matches “263 Main St, Reno, Nev.” in address field 210 is located in the store of input images 116 in memory 112 and then read into drawing tool GUI 122 .
  • processing unit 110 of geo-referenced electronic drawing application 100 may process any symbols that are selected from symbols library 114 along with any other markings that are overlaid upon the geo-referenced image to depict the event of interest.
  • any symbols that are selected using drawing toolbar 216 of drawing tool GUI 122 may be overlaid upon the certain input image 116 in order to depict the event of interest, such as a vehicle accident.
  • a symbol for a car (vehicle #1) and a symbol for a light truck (vehicle #2) are positioned and rendered upon the input image 116 that matches “263 Main St, Reno, Nev.” in address field 210.
  • geo-referenced electronic drawing application 100 is designed to automatically render symbols to scale upon the certain input image 116 according to the settings of scale 214 .
  • other markings (e.g., a polygon, a rectangle, a circle, a line) may also be drawn upon input image 116 to depict the event of interest.
  • lines to indicate skid marks may be drawn upon input image 116 .
  • processing unit 110 of geo-referenced electronic drawing application 100 may process any textual information related to the geo-referenced image.
  • callout 218 may be used for entering and displaying textual information about the vehicle collision. Callout 218 is shown overlaid upon input image 116 .
  • processing unit 110 of geo-referenced electronic drawing application 100 may render and save the event-specific image along with its associated descriptor file.
  • the marked up input image 116 may be saved as an event-specific image 126 .
  • any event-specific images 126 created therein may be converted to any standard digital image file format, such as PDF, JPG, or BMP, and saved. Further, the associated descriptor file 128 is created and saved.
  • the user of geo-referenced electronic drawing application 100 determines whether an image series, such as the example image series 130 of FIG. 3 , is required in order to adequately depict the event of interest. If yes, method 500 proceeds to step 522 . If no, method 500 proceeds to step 526 .
  • At step 522, the user of geo-referenced electronic drawing application 100 determines whether the image series is complete. If yes, method 500 proceeds to step 524. If no, method 500 returns to step 510 to begin creating the next event-specific image.
  • At step 524, the descriptor files 128 of the event-specific images 126 that are included in the image series 130 are associated and the image series 130 is saved.
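  • A minimal Python sketch of associating the descriptor files of an image series is shown below; recording the association in a separate series manifest file is an illustrative assumption, not a requirement of the disclosure.

        import json

        def save_image_series(series_id, descriptor_paths, manifest_path):
            """Record which event-specific image descriptors belong to one series."""
            manifest = {
                "series_id": series_id,
                "frame_count": len(descriptor_paths),
                "descriptors": descriptor_paths,   # ordered as the time-elapsed frames
            }
            with open(manifest_path, "w") as f:
                json.dump(manifest, f, indent=2)
            return manifest_path

        save_image_series("collision-series-0001",
                          ["event_image_01.json", "event_image_02.json", "event_image_03.json"],
                          "collision-series-0001.manifest.json")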
  • At step 526, the event-specific image 126 and/or all event-specific images 126 of the image series 130 and any other information are integrated into the electronic report of interest.
  • a certain event-specific image 126 is integrated into a certain type of report 132 , such as traffic collision report 400 of FIG. 4 .
  • textual information associated with event-specific image 126 may be automatically imported into traffic collision report 400 .
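  • As a minimal illustration, textual information associated with an event-specific image might be imported into report fields as in the following Python sketch; the descriptor and report field names are assumptions for illustration and are not defined by the disclosure.

        def import_into_report(descriptor, report):
            """Copy descriptor fields of an event-specific image into a report record."""
            report["location"] = descriptor.get("address")
            report["diagram_image"] = descriptor.get("image_file")
            report["diagram_created"] = descriptor.get("created_utc")
            return report

        descriptor = {"address": "263 Main St, Reno, Nev.",
                      "image_file": "event_image_01.jpg",
                      "created_utc": "2010-04-02T15:30:00+00:00"}
        print(import_into_report(descriptor, {"report_type": "traffic collision"}))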
  • Referring to FIG. 6, a networked system 600 that includes geo-referenced electronic drawing application 100 for documenting and reporting events is presented.
  • geo-referenced electronic drawing application 100 may be a web-based application. Therefore, networked system 600 may include an application server 610 upon which geo-referenced electronic drawing application 100 is installed.
  • Application server 610 may be any application server, such as a web application server and/or web portal, by which one or more user 612 may access geo-referenced electronic drawing application 100 with respect to documenting and reporting events.
  • Application server 610 may be accessed by users 612 via any networked computing device, such as their respective local computing devices 140.
  • users 612 may be any personnel associated with accident investigation companies, law enforcement agencies, and/or insurance companies.
  • Networked system 600 of the present disclosure may further include an image server 614 , which is one example of an entity supplying input images 116 of FIG. 1 .
  • Image server 614 may be any computer device for storing and providing input images 116 , such as aerial images of geographic locations.
  • Networked system 600 of the present disclosure may further include a central server 616 .
  • central server 616 may be associated with accident investigation companies, law enforcement agencies, and/or insurance companies.
  • Certain business applications, such as management applications 618 may reside on central server 616 .
  • Management applications 618 may be, for example, any incident management applications.
  • a network 620 provides the communication link between any and/or all entities of networked system 600 .
  • network 620 provides the communication network by which information may be exchanged between application server 610 , image server 614 , central server 616 , and computing devices 140 .
  • Network 620 may be, for example, any local area network (LAN) and/or wide area network (WAN) for connecting to the Internet.
  • each entity of networked system 600 includes a communication interface (not shown).
  • the respective communication interfaces of application server 610 , image server 614 , central server 616 , and computing devices 140 may be any wired and/or wireless communication interface by which information may be exchanged between any entities of networked system 600 .
  • Examples of wired communication interfaces may include, but are not limited to, USB ports, RS232 connectors, RJ45 connectors, Ethernet, and any combinations thereof.
  • wireless communication interfaces may include, but are not limited to, an Intranet connection, Internet, Bluetooth® technology, Wi-Fi, Wi-Max, IEEE 802.11 technology, radio frequency (RF), Infrared Data Association (IrDA) compatible protocols, Local Area Networks (LAN), Wide Area Networks (WAN), Shared Wireless Access Protocol (SWAP), any combinations thereof, and other types of wireless networking protocols.
  • geo-referenced electronic drawing application 100 may include a feature for attaching media files to reports 132 .
  • networked system 600 may include certain media capture devices 622 for capturing media files 624 .
  • Media capture devices 622 may be any media capture devices, such as digital cameras, digital audio recorders, digital video recorders, and the like. Therefore, media files 624 may be, for example, digital image files, digital audio files, digital video files, and the like.
  • the media files 624 may likewise have descriptor files (not shown) associated therewith for, for example, associating to certain reports 132 .
  • the media files 624 may be provided as attachments to reports 132 .
  • computing device 140 may include one or more media capture devices as described above.
  • the attached media files 624 may be stamped with time, location and/or direction information.
  • a media file 624 may include a timestamp identifying a calendar date and/or time that the media file was created and/or a calendar date and/or time that the media file was stored in memory by the computing device 140 .
  • the media file may include a location stamp identifying a location (e.g., a city and state or geographic coordinates) where the media file was created and/or a location where the media file was stored in memory by the computing device 140 .
  • a media file may also include a direction stamp specifying directional information associated therewith.
  • If the media file is a photographic image or video that was taken with a camera device associated with a compass, the photographic image or video may be stamped with directional information based on an output of the compass to indicate, for example, that the image or video was taken while the camera lens was facing northwest.
  • the media files 624 may be automatically stamped with time, location and/or direction information.
  • the timestamp and location stamp, particularly when automatically generated, may be used as verification that the media file was stored at a particular time and place, such as the time and place where the report associated with the media file was created.
  • the direction stamp may be used as verification that the media file was created while a media capture device was facing in a particular direction or otherwise had a particular orientation.
  • the location, time and/or direction data used for the location stamp, timestamp and/or direction stamp may originate from the computing device on which geo-referenced electronic drawing application 100 is installed or from any other computing device.
  • the computing device may be GPS-enabled and may include a timer and a compass.
  • the location, time and/or direction data may be based on manual data entry by the user.
  • the media file need not be modified to include the location, time and/or direction data described above, as the data may alternatively be stored in association with the media file as distinct data.
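  • The following Python sketch illustrates storing time, location, and direction stamps in association with a media file as distinct data (here, a sidecar file) rather than modifying the media file itself; the sidecar layout and field names are assumptions for illustration.

        import json
        from datetime import datetime, timezone

        def stamp_media_file(media_path, lat=None, lon=None, heading_degrees=None):
            """Write a sidecar file holding time, location, and direction stamps."""
            stamps = {
                "media_file": media_path,
                "timestamp_utc": datetime.now(timezone.utc).isoformat(),   # automatic timestamp
                "location": {"lat": lat, "lon": lon},                      # e.g., from a GPS-enabled device
                "heading_degrees": heading_degrees,                        # e.g., 315 for northwest, from a compass
            }
            sidecar_path = media_path + ".stamps.json"
            with open(sidecar_path, "w") as f:
                json.dump(stamps, f, indent=2)
            return sidecar_path

        # Example: a photo taken at the incident site while the camera faced northwest.
        # stamp_media_file("scene_photo_01.jpg", lat=39.524, lon=-119.813, heading_degrees=315)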
  • the computing device 140 shown in FIG. 6 may have a communication interface that may receive information from network 620 , which may be a LAN and/or WAN for connecting to the Internet.
  • information about an environmental condition may be received as a media file via the communication interface.
  • For example, weather information (e.g., temperature, visibility, and precipitation information), traffic information, and/or construction information may be received from the Internet via the communication interface.
  • Such information may be received from a weather service, traffic service, traffic records, construction service or the like.
  • Received information may be attached as files to reports 132. Alternatively, or in addition, received information may be incorporated within the reports 132 themselves.
  • For example, if the received information indicates that the weather at the time of an accident was sunny, the report could include this information as text in a data field, or an event-specific image 126 in the report could include an image of a sun or another icon indicating sunny weather.
  • As another example, if the received information indicates that visibility was limited (e.g., to approximately 20 feet), the report could include this information as text in a data field and/or represent this information in an event-specific image 126; for instance, the area beyond a 20 foot radius of the driver in the event-specific image 126 could be colored gray, blacked out, or designated with hash marks.
  • the traffic collision report 400 could be manually updated to include weather information, traffic information, construction information, or the like.
  • Condition information received via the communication interface may be stored with and/or stamped with location, time and/or direction data indicating when the condition information was stored by the computing device 140 .
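  • A minimal Python sketch of attaching received condition information to a report, stamped with the time and location at which it was stored, is shown below; the payload fields are hypothetical and no particular weather or traffic service API is implied by the disclosure.

        import json
        from datetime import datetime, timezone

        def attach_condition_info(report, condition_payload, stored_lat, stored_lon):
            """Attach received condition information to a report with storage stamps."""
            report.setdefault("attachments", []).append({
                "kind": "condition_information",
                "data": condition_payload,                                  # e.g., weather or traffic data
                "stored_utc": datetime.now(timezone.utc).isoformat(),       # when the info was stored
                "stored_location": {"lat": stored_lat, "lon": stored_lon},  # where it was stored
            })
            return report

        report = {"report_type": "traffic collision"}
        weather = {"weather": "sunny", "temperature_f": 72, "visibility_mi": 10}
        print(json.dumps(attach_condition_info(report, weather, 39.524, -119.813), indent=2))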
  • central server 616 of networked system 600 may include a collection of historical reports 626 , which are records of reports 132 that have been processed in the past.
  • historical reports 626 may be useful to inform current reports 132 , such as current accident reports that are being processed. For example, being able to review historical information pertaining to a certain intersection may be useful to add to an accident report for fault analysis purposes, as certain trends may become apparent.
  • historical reports 626 may indicate for a certain highway or street intersection that a steep hill is present, the traffic light malfunctions, the line of sight to the stop sign is obstructed, there is a poor angle of visibility at the intersection, the intersection is an accident-prone area in poor weather conditions (e.g., a bridge approaching the intersection freezes over), and the like.
  • information from historical reports 626 may be other information that may be integrated into reports 132 .
  • each user of networked system 600 may access geo-referenced electronic drawing application 100 via his/her local computing device 140 .
  • Networked system 600 may provide a secure login function, which allows users 612 to access the functions of geo-referenced electronic drawing application 100 .
  • Once authorized, users 612 may open drawing tool GUI 122 using, for example, the web browsers of their computing devices 140.
  • Geographic location information is read into or manually entered into drawing tool GUI 122 and event-specific images 126 , image series 130 , and/or reports 132 may be generated as described with reference to FIGS. 1 through 5 .
  • input images 116 of image server 614 may be the source of the geo-referenced images that are read into geo-referenced electronic drawing application 100 .
  • reports 132 that include geo-referenced images may be transmitted in electronic form from the computing devices 140 of users 612 to any entities connected to network 620 of networked system 600 .
  • reports 132 that include geo-referenced images may be transmitted in electronic form from the computing devices 140 of users 612 to central server 616 for further review and processing by authorized users only of networked system 600 .
  • networked system 600 is not limited to the types and numbers of entities that are shown in FIG. 6 . Any types and numbers of entities that may be useful in event documenting and reporting systems may be included in networked system 600 . Further, in another embodiment, geo-referenced electronic drawing application 100 may be a standalone application that resides on each networked computing device 140 . Therefore, in this embodiment, networked system 600 of FIG. 6 need not include application server 610 .
  • geo-referenced electronic drawing application 100 of the present disclosure provides the ability to electronically mark up real world geo-referenced images, such as input images 116, with symbols, shapes, and/or lines in order to provide improved and consistent accuracy with respect to drawings that support incident reports.
  • geo-referenced electronic drawing application 100 of the present disclosure also provides the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines to scale, again providing improved and consistent accuracy with respect to drawings that support incident reports.
  • geo-referenced electronic drawing application 100 of the present disclosure provides a standard symbols library, such as symbols library 114 , thereby providing standardization with respect to drawings that support incident reports.
  • networked systems that include geo-referenced electronic drawing application 100 of the present disclosure, such as networked system 600 , provide improved distribution, tracking, and auditing of reports among entities and provide improved control over access to reports.
  • Inventive embodiments are presented by way of example only; within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed.
  • inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein.
  • the above-described embodiments can be implemented in any of numerous ways.
  • the embodiments may be implemented using hardware, software or a combination thereof.
  • the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
  • a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
  • Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet.
  • networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • FIG. 14 shows an illustrative computer 1400 that may be used at least in part to implement the geo-referenced electronic drawing application 100 described herein in accordance with some embodiments.
  • the computer 1400 comprises a memory 1410 , one or more processing units 1412 (also referred to herein simply as “processors”), one or more communication interfaces 1414 , one or more display units 1416 , and one or more user input devices 1418 .
  • the memory 1410 may comprise any computer-readable media, and may store computer instructions (also referred to herein as “processor-executable instructions”) for implementing the various functionalities described herein.
  • the processing unit(s) 1412 may be used to execute the instructions.
  • the communication interface(s) 1414 may be coupled to a wired or wireless network, bus, or other communication means and may therefore allow the computer 1400 to transmit communications to and/or receive communications from other devices.
  • the display unit(s) 1416 may be provided, for example, to allow a user to view various information in connection with execution of the instructions.
  • the user input device(s) 1418 may be provided, for example, to allow the user to make manual adjustments, make selections, enter data or various other information, and/or interact in any of a variety of manners with the processor during execution of the instructions.
  • the various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
  • inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above.
  • the computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.
  • The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • data structures may be stored in computer-readable media in any suitable form.
  • data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields.
  • any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
  • inventive concepts may be embodied as one or more methods, of which an example has been provided.
  • the acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
  • a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

Abstract

One or more electronic drawings may be generated to document and/or report an event, in which various elements of the drawing(s) include geographic reference information. A symbols library, a collection of images (e.g., geo-referenced images), geo-location data, and time and location data may be stored in memory for use in connection with such drawings, and a drawing tool graphical user interface (GUI) may be provided for electronically marking-up images on which one or more drawings are based. The marked-up images may be event-specific images, and may be integrated into various types of electronic reports for accurately depicting events of interest, such as personal injury events, vehicle accidents, and/or property damage events.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application claims a priority benefit, under 35 U.S.C. §119(e), to U.S. provisional patent application Ser. No. 61/166,385, entitled “Geo-Referenced Electronic Drawing Application for Documenting and Reporting Events,” filed on Apr. 3, 2009 under attorney docket no. D0687.70030US00.
  • This application also claims a priority benefit, under 35 U.S.C. §119(e), to U.S. provisional patent application Ser. No. 61/166,392, entitled “Data Acquisition System for and Method of Analyzing Vehicle Data for Generating an Electronic Representation of Vehicle Operations,” filed on Apr. 3, 2009 under attorney docket no. D0687.70032US00.
  • Each of the above-identified applications is incorporated herein by reference.
  • BACKGROUND
  • In any business setting, incidents that are not part of the standard business practice may take place and cause interruption to the business operation. Such incidents can potentially reduce the quality of the services or products of the business, and sometimes may impose civil or even criminal liabilities on the business. For any given business, the particular types of incidents that are disruptive may depend on the nature of the business. For example, in field service applications incidents to be reported may include personal injury events, vehicle accidents, and/or any types of property damage events that may occur in the field, and the like.
  • Currently, systems have been implemented for reporting and managing certain incidents. Using the example of vehicle accidents, upon arrival at the scene of a vehicle accident, a police officer or other investigator usually fills out a paper accident report explaining in detail the accident scene. As part of this report, the police officer or other investigator may attempt to draw a sketch of the accident scene on a diagram of the road, which is to be submitted with the paper accident report. However, a drawback of these paper-based reports, which may be handwritten and may include hand sketches, is that the content thereof may be inconsistent, sloppy, illegible, inaccurate, and/or incomplete. As a result, incidents, such as vehicle accidents, may be poorly documented. Once created, the accident reports are distributed to responsible entities for review, such as to accident investigation companies, law enforcement agencies, insurance companies, and any supervisory and/or management personnel. Similar processes may exist with respect to handling personal injury reports and property damage reports.
  • SUMMARY
  • Applicants have recognized and appreciated that in conventional reporting systems, a major issue is the distribution of reports and tracking of the progress of the reviews to ensure timely resolution of the events. Depending on the types of events and other factors, different reports may have to be reviewed by different entities. The existence of multiple review routing paths can be rather confusing, making it difficult to ensure that the paper report is routed to the right entities in the right order. Moreover, paper reports may be misplaced or lost during transit to the different entities and the exact status of reports may be hard to determine. Further, a drawback of conventional reporting systems is that reports and, in particular paper reports, may not be in a form that is easy to retrieve for, for example, historical reference.
  • Another concern regarding conventional reporting systems is the lack of effective control over the access to the reports. Reports may contain sensitive or confidential information that should be viewed only by authorized entities. The necessary access control, however, can be difficult to implement or enforce due to the lack of effective measures to prevent unauthorized access to the documents or other factors such as distribution errors.
  • Therefore, Applicants have recognized that a need exists for improved ways of creating, distributing, and/or retrieving reports, such as, but not limited to, personal injury reports, vehicle accident reports, any types of damage reports, and the like.
  • In view of the foregoing, various embodiments of the present invention are directed to methods, apparatus, and systems for documenting events via time-elapsed geo-referenced electronic drawings. With respect to incidents, such as property damage and personal injury, that may be reported in field service applications, in exemplary embodiments one or more drawings may be provided that are referenced to a geographic location and/or that in some way indicate (to scale) the actual environment in which incidents have occurred. In various aspects, drawings may be provided to scale, include accurate directional and positional information, and/or include representations of various environmental landmarks (e.g., trees, buildings, poles, fire hydrants, barriers, any structures, etc). Examples of reports that may include one or more geo-referenced electronic drawings according to various inventive embodiments disclosed herein include, but are not limited to, personal injury reports, vehicle accident reports, and any types of damage reports.
  • In sum, one embodiment described herein is directed to an apparatus for documenting an incident at an incident site. The apparatus comprises a communication interface; a display device; at least one user input device; a memory to store processor-executable instructions; and a processing unit coupled to the communication interface, the display device, the at least one user input device, and the memory. Upon execution of the processor-executable instructions by the processing unit, the processing unit: controls the communication interface to electronically receive source data representing at least one input image of a geographic area including the incident site; controls the display device to display at least a portion of the at least one input image; acquires user input from the at least one user input device to provide a representation of at least a portion of the incident on the displayed image; automatically acquires time and/or date information indicating a time and/or date that the user input was acquired; generates a marked-up digital image including the representation of at least a portion of the incident based on the user input; further controls the communication interface and/or the memory to electronically transmit and/or electronically store information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and further controls the communication interface and/or the memory to electronically transmit and/or electronically store the time and/or date information in association with the information relating to the marked-up digital image so as to document when the representation of the at least a portion of the incident was created.
  • Another embodiment is directed to a method for documenting an incident at an incident site. The method comprises: A) electronically receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) adding to the at least a portion of the at least one input image, based on user input received via at least one user input device associated with the display device, a representation of at least a portion of the incident to thereby generate a marked-up digital image; D) automatically acquiring time and/or date information indicating a time and/or date that the user input was acquired; E) electronically transmitting and/or electronically storing information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and F) electronically transmitting and/or electronically storing the time and/or date information in association with the information relating to the marked-up digital image so as to document when the representation of the at least a portion of the incident was created.
  • A further embodiment is directed to at least one computer-readable medium encoded with instructions that, when executed on at least one processing unit, perform a method for documenting an incident at an incident site. The method comprises: A) electronically receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving user input via at least one user input device associated with the display device; D) automatically acquiring time and/or date information indicating a time and/or date that the user input was acquired; E) adding, based on the user input, a representation of at least a portion of the incident to the displayed at least one input image to thereby generate a marked-up digital image; F) electronically transmitting and/or electronically storing information relating to the marked-up digital image so as to document the incident with respect to the geographic area; and G) electronically transmitting and/or electronically storing the time and/or date information in association with the information relating to the marked-up digital image so as to document at least generally when the representation of the at least a portion of the incident was created.
  • Another embodiment is directed to an apparatus for documenting an incident at an incident site. The apparatus comprises a communication interface; a display device; at least one user input device; a memory to store processor-executable instructions; and a processing unit coupled to the communication interface, the display device, the at least one user input device, and the memory. Upon execution of the processor-executable instructions by the processing unit, the processing unit: controls the communication interface to electronically receive source data representing at least one input image of a geographic area including the incident site; controls the display device to display at least a portion of the at least one input image; acquires first user input from the at least one user input device to provide a first representation of at least a portion of the incident at a first time on the at least one input image; generates a first marked-up digital image including the first representation based on the first user input; acquires second user input from the at least one user input device to provide a second representation of at least a portion of the incident at a second time on the at least one input image; generates a second marked-up digital image including the second representation based on the second user input; and further controls the communication interface and/or the memory to electronically transmit and/or electronically store information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
  • A further embodiment is directed to a method for documenting an incident at an incident site. The method comprises: A) receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving first user input via at least one user input device associated with the display device; D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image; E) receiving second user input via the at least one user input device; F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
  • Another embodiment is directed to at least one computer-readable medium encoded with instructions that, when executed on at least one processing unit, perform a method for documenting an incident at an incident site. The method comprises: A) receiving source data representing at least one input image of a geographic area including the incident site; B) processing the source data so as to display at least a portion of the at least one input image on a display device; C) receiving first user input via at least one user input device associated with the display device; D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image; E) receiving second user input via the at least one user input device; F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
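  • By way of illustration only, the following is a minimal Python sketch of the general flow described in the embodiments above (receive source data for an input image, display it, acquire user input, automatically acquire time/date information, generate a marked-up digital image, and store it with that time information). All names and structures in the sketch are assumptions for illustration and do not limit the embodiments.

        from datetime import datetime, timezone

        def document_incident(source_image, user_markup, store):
            """Illustrative flow: display image, accept mark-up, timestamp it, store it."""
            mark_time = datetime.now(timezone.utc)           # time/date the user input was acquired
            marked_up_image = {                              # marked-up digital image (as a record)
                "base_image": source_image,
                "markup": user_markup,
            }
            store.append({
                "marked_up_image": marked_up_image,
                "markup_time_utc": mark_time.isoformat(),    # stored in association with the image
            })
            return store

        records = document_incident("aerial_263_main_st.jpg",
                                    [{"symbol": "car", "pixel": (120, 240)}],
                                    [])
        print(records)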
  • The following U.S. published applications are hereby incorporated herein by reference:
  • U.S. publication no. 2008-0228294-A1, published Sep. 18, 2008, filed Mar. 13, 2007, and entitled “Marking System and Method With Location and/or Time Tracking;”
  • U.S. publication no. 2008-0245299-A1, published Oct. 9, 2008, filed Apr. 4, 2007, and entitled “Marking System and Method;”
  • U.S. publication no. 2009-0013928-A1, published Jan. 15, 2009, filed Sep. 24, 2008, and entitled “Marking System and Method;”
  • U.S. publication no. 2009-0202101-A1, published Aug. 13, 2009, filed Feb. 12, 2008, and entitled “Electronic Manifest of Underground Facility Locate Marks;”
  • U.S. publication no. 2009-0202110-A1, published Aug. 13, 2009, filed Sep. 11, 2008, and entitled “Electronic Manifest of Underground Facility Locate Marks;”
  • U.S. publication no. 2009-0201311-A1, published Aug. 13, 2009, filed Jan. 30, 2009, and entitled “Electronic Manifest of Underground Facility Locate Marks;”
  • U.S. publication no. 2009-0202111-A1, published Aug. 13, 2009, filed Jan. 30, 2009, and entitled “Electronic Manifest of Underground Facility Locate Marks;”
  • U.S. publication no. 2009-0204625-A1, published Aug. 13, 2009, filed Feb. 5, 2009, and entitled “Electronic Manifest of Underground Facility Locate Operation;”
  • U.S. publication no. 2009-0204466-A1, published Aug. 13, 2009, filed Sep. 4, 2008, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0207019-A1, published Aug. 20, 2009, filed Apr. 30, 2009, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0210284-A1, published Aug. 20, 2009, filed Apr. 30, 2009, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0210297-A1, published Aug. 20, 2009, filed Apr. 30, 2009, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0210298-A1, published Aug. 20, 2009, filed Apr. 30, 2009, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0210285-A1, published Aug. 20, 2009, filed Apr. 30, 2009, and entitled “Ticket Approval System For and Method of Performing Quality Control In Field Service Applications;”
  • U.S. publication no. 2009-0204238-A1, published Aug. 13, 2009, filed Feb. 2, 2009, and entitled “Electronically Controlled Marking Apparatus and Methods;”
  • U.S. publication no. 2009-0208642-A1, published Aug. 20, 2009, filed Feb. 2, 2009, and entitled “Marking Apparatus and Methods For Creating an Electronic Record of Marking Operations;”
  • U.S. publication no. 2009-0210098-A1, published Aug. 20, 2009, filed Feb. 2, 2009, and entitled “Marking Apparatus and Methods For Creating an Electronic Record of Marking Apparatus Operations;”
  • U.S. publication no. 2009-0201178-A1, published Aug. 13, 2009, filed Feb. 2, 2009, and entitled “Methods For Evaluating Operation of Marking Apparatus;”
  • U.S. publication no. 2009-0202112-A1, published Aug. 13, 2009, filed Feb. 11, 2009, and entitled “Searchable Electronic Records of Underground Facility Locate Marking Operations;”
  • U.S. publication no. 2009-0204614-A1, published Aug. 13, 2009, filed Feb. 11, 2009, and entitled “Searchable Electronic Records of Underground Facility Locate Marking Operations;”
  • U.S. publication no. 2009-0238414-A1, published Sep. 24, 2009, filed Mar. 18, 2008, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0241045-A1, published Sep. 24, 2009, filed Sep. 26, 2008, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0238415-A1, published Sep. 24, 2009, filed Sep. 26, 2008, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0241046-A1, published Sep. 24, 2009, filed Jan. 16, 2009, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0238416-A1, published Sep. 24, 2009, filed Jan. 16, 2009, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0237408-A1, published Sep. 24, 2009, filed Jan. 16, 2009, and entitled “Virtual White Lines for Delimiting Planned Excavation Sites;”
  • U.S. publication no. 2009-0238417-A1, published Sep. 24, 2009, filed Feb. 6, 2009, and entitled “Virtual White Lines for Indicating Planned Excavation Sites on Electronic Images;”
  • U.S. publication no. 2009-0327024-A1, published Dec. 31, 2009, filed Jun. 26, 2009, and entitled “Methods and Apparatus for Quality Assessment of a Field Service Operation;”
  • U.S. publication no. 2010-0010862-A1, published Jan. 14, 2010, filed Aug. 7, 2009, and entitled “Methods and Apparatus for Quality Assessment of a Field Service Operation Based on Geographic Location;”
  • U.S. publication no. 2010-0010863-A1, published Jan. 14, 2010, filed Aug. 7, 2009, and entitled “Methods and Apparatus for Quality Assessment of a Field Service Operation Based on Multiple Scoring Categories;”
  • U.S. publication no. 2010-0010882-A1, published Jan. 14, 2010, filed Aug. 7, 2009, and entitled “Methods and Apparatus for Quality Assessment of a Field Service Operation Based on Dynamic Assessment Parameters;” and
  • U.S. publication no. 2010-0010883-A1, published Jan. 14, 2010, filed Aug. 7, 2009, and entitled “Methods and Apparatus for Facilitating a Quality Assessment of a Field Service Operation Based on Multiple Quality Assessment Criteria.”
  • It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.
  • FIG. 1 illustrates a functional block diagram of a geo-referenced electronic drawing application for documenting and reporting events, according to the present disclosure;
  • FIG. 2 illustrates an example of a drawing tool GUI of the geo-referenced electronic drawing application, according to the present disclosure;
  • FIG. 3 illustrates an example of a series of geo-referenced drawings that are generated using the geo-referenced electronic drawing application, according to the present disclosure;
  • FIG. 4 illustrates an example of a report that is generated using the geo-referenced electronic drawing application and that includes a geo-referenced drawing, according to the present disclosure;
  • FIG. 5 illustrates a flow diagram of an example of a method of operation of the geo-referenced electronic drawing application, according to the present disclosure;
  • FIG. 6 illustrates a functional block diagram of a networked system that includes the geo-referenced electronic drawing application for documenting and reporting events, according to the present disclosure;
  • FIG. 7 shows a map, representing an exemplary input image;
  • FIG. 8 shows a construction/engineering drawing, representing an exemplary input image;
  • FIG. 9 shows a land survey map, representing an exemplary input image;
  • FIG. 10 shows a grid, overlaid on the construction/engineering drawing of FIG. 8, representing an exemplary input image;
  • FIG. 11 shows a street level image, representing an exemplary input image;
  • FIG. 12 shows the drawing tool GUI of FIG. 2 displaying a layer directory pane that facilitates the manipulation of layers;
  • FIG. 13 shows the drawing tool GUI of FIG. 2 displaying an animation controls window that facilitates generation of an animated sequence;
  • FIG. 14 shows an illustrative computer that may be used at least in part to implement the geo-referenced electronic drawing application in accordance with some embodiments; and
  • FIG. 15 shows an example of an input image constructed from bare data.
  • DETAILED DESCRIPTION
  • Following below are more detailed descriptions of various concepts related to, and embodiments of, inventive methods, apparatus and systems according to the present disclosure for facilitating documentation of events (e.g., an incident, such as a motor vehicle accident) via one or more time-elapsed geo-referenced electronic drawings. It should be appreciated that various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, as the disclosed concepts are not limited to any particular manner of implementation. Examples of specific implementations and applications are provided primarily for illustrative purposes.
  • A geo-referenced electronic drawing application for documenting and reporting events is described herein. The geo-referenced electronic drawing application may provide a mechanism for importing a geo-referenced image that may be marked up with symbols and/or any other markings for indicating the details of an event, such as a vehicle accident. The geo-referenced image may include data associated therewith (e.g., embedded metadata) that allows identification of locational information (e.g., locational coordinates) for any point or region on the image. Further, the geo-referenced electronic drawing application may provide a mechanism for generating a report of the event that includes the marked up geo-referenced image. A networked system that includes the geo-referenced electronic drawing application is also described.
  • It should be appreciated that while the imported or otherwise acquired image is described herein as "geo-referenced," and the drawing application is likewise described as geo-referenced, the image need not be geo-referenced unless required for a particular implementation and the drawing application may be used for non geo-referenced images. In many instances, an image that is not geo-referenced may be suitably used. Examples of non geo-referenced images that may be suitable in various scenarios are: a stock or generic image of an intersection, a stock or generic image of a room, a stock or generic image of a street, and a photograph taken during investigation of an incident or generation of a report on the incident. Of course, these are merely exemplary, as many other types of non geo-referenced images are possible.
  • Further, while certain embodiments may be described in the context of generating a vehicle accident report, this is exemplary only. The geo-referenced electronic drawing application described herein is suitable for generating any type of report in which a geo-referenced image (or other image) may be useful, such as, but not limited to, personal injury reports, vehicle accident reports, any types of property damage reports, and the like. For example, the methods and apparatus described herein may be useful for providing reports that include images in various field service applications, such as, but not limited to, those of underground facilities locate companies, excavation companies, landscaping companies, tree care and removal companies, utility installation and repair companies, and the like.
  • The geo-referenced electronic drawing application described herein may provide the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines in order to provide improved and consistent accuracy with respect to drawings that support incident reports.
  • In addition, the geo-referenced electronic drawing application described herein may provide the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines to scale, again providing improved and consistent accuracy with respect to drawings that support incident reports.
  • Further, the geo-referenced electronic drawing application may provide a standard symbols library, thereby providing standardization with respect to drawings that support incident reports.
  • Networked systems that include the geo-referenced electronic drawing application described herein may provide improved distribution, tracking, and auditing of reports among entities and the systems provide improved control over access to reports.
  • Referring to FIG. 1, a functional block diagram of a geo-referenced electronic drawing application 100 for documenting and reporting events is presented. Geo-referenced electronic drawing application 100 may be a standalone and/or a web-based software application that allows a user to import a geo-referenced image and then mark up the image with symbols and/or any other markings for indicating the details of an event, such as a vehicle accident.
  • Geo-referenced electronic drawing application 100 may be executed by a processing unit 110 and stored in memory 112. Processing unit 110 may be any standard microcontroller or microprocessor device that is capable of executing program instructions of geo-referenced electronic drawing application 100. Memory 112 may be any standard data storage medium. In one example, a symbols library 114, a collection of input images 116, certain geo-location data 118, and timestamp data 120, may be stored in memory 112.
  • Timestamp data 120 may include calendar date and/or time information. Timestamp data 120 may originate from, for example, the computing device on which geo-referenced electronic drawing application 100 is installed, any other computing device, and/or manual entry by the user.
  • Location stamp data 150 may include location information such as a city and state, zip code, or geographic coordinates. Location stamp data 150 may originate from, for example, the computing device on which geo-referenced electronic drawing application 100 is installed, any other computing device, and/or manual entry by the user.
  • Symbols library 114, input images 116, geo-location data 118, timestamp data 120 and location stamp data 150 support the functions of a drawing tool graphical user interface (GUI) 122 of geo-referenced electronic drawing application 100. Drawing tool GUI 122 is suitable for presenting on the display of any computing device, such as a computing device 140. By reading geographic location information from geo-location data 118 and/or by processing geographic location information that may be manually entered, processing unit 110 retrieves a certain input image 116 that corresponds to the geographic location information and displays the input image 116 in a window of drawing tool GUI 122. Geographic location information may be, for example, a physical address, latitude and longitude coordinates, and/or any global positioning system (GPS) data.
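  • As a minimal illustration, geographic location information may arrive in several forms (a street address, latitude and longitude coordinates, or GPS data); the following Python sketch normalizes these forms into a single lookup key before the corresponding input image 116 is retrieved. The parsing rules are assumptions for illustration; geo-location data 118 may arrive in other formats.

        def normalize_location(location):
            """Return either ("address", text) or ("point", (lat, lon))."""
            if isinstance(location, dict):                     # e.g., GPS data {"lat": ..., "lon": ...}
                return ("point", (float(location["lat"]), float(location["lon"])))
            text = str(location).strip()
            parts = text.split(",")
            if len(parts) == 2:
                try:                                           # e.g., "39.524, -119.813"
                    return ("point", (float(parts[0]), float(parts[1])))
                except ValueError:
                    pass
            return ("address", text)                           # e.g., "263 Main St, Reno, Nev."

        print(normalize_location({"lat": 39.524, "lon": -119.813}))
        print(normalize_location("263 Main St, Reno, Nev."))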
  • For purposes of the present disclosure, an input image 116 is any image represented by source data that is electronically processed (e.g., the source data is in a computer-readable format) to display the image on a display device. An input image 116 may include any of a variety of paper/tangible image sources that are scanned (e.g., via an electronic scanner) or otherwise converted so as to create source data (e.g., in various formats such as XML, PDF, JPG, BMP, etc.) that can be processed to display the input image 116. An input image 116 also may include an image that originates as source data or an electronic file without necessarily having a corresponding paper/tangible copy of the image (e.g., an image of a “real-world” scene acquired by a digital still frame or video camera or other image acquisition device, in which the source data, at least in part, represents pixel information from the image acquisition device).
  • In some exemplary implementations, input images 116 according to the present disclosure may be created, provided, and/or processed by a geographic information system (GIS) that captures, stores, analyzes, manages and presents data referring to (or linked to) location, such that the source data representing the input image 116 includes pixel information from an image acquisition device (corresponding to an acquired “real world” scene or representation thereof), and/or spatial/geographic information (“geo-encoded information”).
  • In some exemplary implementations, one or more input images 116 may be stored in local memory 112 of the computing device 140 and/or retrieved from the optional remote computer (e.g., via the communication interface 124) and then stored in local memory. Various information may be derived from the one or more input images for display (e.g., all or a portion of the input image, metadata associated with the input image, etc.).
  • In view of the foregoing, various examples of input images and source data representing input images 116 according to the present disclosure, to which the inventive concepts disclosed herein may be applied, include but are not limited to:
      • Various maps, such as street/road maps (e.g., map 700 of FIG. 7), topographical maps, military maps, parcel maps, tax maps, town and county planning maps, virtual maps, etc. (such maps may or may not include geo-encoded information). Such maps may be scaled to a level appropriate for the application;
      • Architectural, construction and/or engineering drawings and virtual renditions of a space/geographic area (including “as built” or post-construction drawings). Such drawings/renditions may be useful, e.g., in property damage report applications or for documenting construction, landscaping or maintenance. An exemplary construction/engineering drawing 800 is shown in FIG. 8;
      • Land surveys, i.e., plots produced at ground level using references to known points such as the center line of a street to plot the metes and bounds and related location data regarding a building, parcel, utility, roadway, or other object or installation. Land survey images may be useful, e.g., in vehicular incident report applications or police report applications. FIG. 9 shows an exemplary land survey map 900;
      • A grid (a pattern of horizontal and vertical lines used as a reference) to provide representational geographic information (which may be used “as is” for an input image or as an overlay for an acquired “real world” scene, drawing, map, etc.). An exemplary grid 1000, overlaid on construction/engineering drawing 800, is shown in FIG. 10. It should be appreciated that the grid 1000 may itself serve as the input image (i.e., a “bare” grid), or be used together with another underlying input image;
      • “Bare” data representing geo-encoded information (geographical data points) and not necessarily derived from an acquired/captured real-world scene (e.g., not pixel information from a digital camera or other digital image acquisition device). Such “bare” data may be nonetheless used to construct a displayed input image, and may be in any of a variety of computer-readable formats, including XML).
      • One example of bare data is geo-referenced data relating to municipal assets. Databases exist that include geo-location information (e.g., latitude and longitude coordinates) and attribute information (e.g., sign type) for municipal assets such as signs, crash attenuators, parking meters, barricades, and guardrails. Such a database may be used in connection with an asset management system, such as the Infor EAM (Enterprise Asset Management) system by Infor Global Solutions of Alpharetta, Ga., to manage municipal assets. Using bare data relating to municipal assets, a geo-encoded image may be constructed that includes representations of municipal assets at their relative locations. In particular, the attribute information may be used to select a symbol representing the asset in the image, and the geo-location information may be used to determine the placement of the symbol in the image.
      • Other examples of bare data are geo-referenced data relating to weather and geo-referenced data relating to traffic. Both weather and traffic data are available from various sources in Geographic Information System (GIS) format. For example, a set of points, lines, and/or regions in a spatial database may represent locations or areas having a particular traffic attribute (e.g., heavy traffic, construction, moderate congestion, minor stall, normal speeds) or a particular weather attribute (e.g., heavy snow, rain, hail, fog, lightning, clear skies). The data in the database may be dynamic, such that the points, lines, and/or regions and corresponding attributes change as the traffic and weather conditions change. Using bare data relating to traffic and/or weather, a geo-encoded image may be constructed that includes representations of traffic and/or weather conditions at their relative locations. In particular, the attribute information may be used to select a symbol, pattern, and/or color representing the traffic or weather condition in the image, and the geo-location information may be used to determine the placement of the symbol, pattern and/or color in the image. An example of a source for GIS traffic data is NAVIGATOR, the Georgia Department of Transportation's Intelligent Transportation System (ITS). GIS weather data is available from the National Weather Service (NWS). Such weather data may be provided as shapefiles, which is a format for storing geographic information and associated attribute information. Shapefiles may include information relating to weather warnings (e.g., tornado, severe thunderstorm, and flash flood warnings) and the like.
  • FIG. 15 shows an example of an input image 1500 constructed from bare data. In particular, input image 1500 includes a representation of a street sign 1510, representations of traffic conditions 1512 and 1514, and a representation of a weather condition 1516. The location of the street sign representation 1510 and traffic condition representations 1512 and 1514 may correspond to the actual locations of the street signs and traffic conditions in the region shown in the input image 1500. The location of the representation of the weather condition 1516 may be arbitrarily selected, or selected to be in a corner of the input image 1500, as the representation may indicate that the weather condition corresponds generally to the entire region shown in the input image 1500. Each of the representations shown in FIG. 15 is based on geo-location information (e.g., latitude and longitude coordinates) and attribute information (e.g., a sign type, traffic conditions, and a weather condition). In the example shown, the type of street sign 1510 is a stop sign, the traffic conditions 1512 and 1514 are “construction” and “light traffic,” and the weather condition 1516 is lightning; and
      • Photographic renderings/images, including street level (see e.g., street level image 1100 of FIG. 11), topographical, satellite, and aerial photographic renderings/images, any of which may be updated periodically to capture changes in a given geographic area over time (e.g., seasonal changes such as foliage density, which may variably impact the ability to see some aspects of the image). Such photographic renderings/images may be useful, e.g., in connection with preparing property damage reports, vehicular incident reports, police reports, etc.
  • It should also be appreciated that source data representing an input image 116 may be compiled from multiple data/information sources; for example, any two or more of the examples provided above for input images and source data representing input images 116, or any two or more other data sources, can provide information that can be combined or integrated to form source data that is electronically processed to display an image on a display device.
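   • By way of a non-limiting illustration, the following sketch shows one way such “bare” geo-referenced data might be converted into symbol placements on a geo-encoded image. The record layout, the mapping from attribute values to symbol identifiers, and all names below (GeoImage, place_symbols, the sample coordinates) are assumptions made for illustration and are not defined by this disclosure; a production implementation would also draw the actual symbol graphics.

      from dataclasses import dataclass

      # Hypothetical mapping from attribute values to symbol identifiers.
      SYMBOL_FOR_ATTRIBUTE = {
          "stop_sign": "SIGN_STOP",
          "construction": "TRAFFIC_CONSTRUCTION",
          "light_traffic": "TRAFFIC_LIGHT",
          "lightning": "WEATHER_LIGHTNING",
      }

      @dataclass
      class GeoImage:
          width_px: int
          height_px: int
          lat_min: float
          lat_max: float
          lon_min: float
          lon_max: float

          def to_pixel(self, lat, lon):
              # Simple linear mapping of latitude/longitude to pixel coordinates.
              x = (lon - self.lon_min) / (self.lon_max - self.lon_min) * self.width_px
              y = (self.lat_max - lat) / (self.lat_max - self.lat_min) * self.height_px
              return int(x), int(y)

      def place_symbols(image, records):
          # records: iterable of (latitude, longitude, attribute) tuples ("bare" data).
          placements = []
          for lat, lon, attribute in records:
              symbol = SYMBOL_FOR_ATTRIBUTE.get(attribute, "UNKNOWN")
              placements.append((symbol, image.to_pixel(lat, lon)))
          return placements

      img = GeoImage(800, 600, lat_min=39.52, lat_max=39.53, lon_min=-119.82, lon_max=-119.81)
      bare_data = [(39.525, -119.815, "stop_sign"), (39.522, -119.818, "construction")]
      print(place_symbols(img, bare_data))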
   • Referring to FIG. 2, an example of a drawing tool GUI 122 of geo-referenced electronic drawing application 100 is presented. In the case of a web-based application, drawing tool GUI 122 may be implemented, for example, by a web browser that is presented via any networked computing device, such as computing device 140 of FIG. 1. In the case of a standalone application, drawing tool GUI 122 may be implemented, for example, by a GUI window that is presented via any computing device.
  • Drawing tool GUI 122 may present a certain input image 116 that corresponds to specified geographic location information. For example, location information from geo-location data 118 may be automatically read into an address field 210 and/or a geo-location data field 212. Alternatively, location information may be manually entered in address field 210 and/or geo-location data field 212. In one example, input image 116 may be an aerial image that corresponds to the geographic location information. Overlaying input image 116 may be an image scale 214. Input image 116 is read into drawing tool GUI 122 and may be oriented in the proper manner with respect to directional heading (i.e., north, south, east, and west).
  • Drawing tool GUI 122 may also include various palettes, toolbars, or other interfaces that enable the user to manipulate (e.g., zoom in, zoom out) and/or mark up input image 116. For example, drawing tool GUI 122 may include a drawing toolbar 216 that may include a sketching palette as well as a symbols palette. The sketching palette portion of drawing toolbar 216 may provide standard drawing tools that allow a user to draw certain shapes (e.g., a polygon, a rectangle, a circle, a line) atop input image 116. The symbols palette portion of drawing toolbar 216 provides a collection of any symbols that may be useful for depicting the event of interest, such as a vehicle accident. The source of these symbols may be symbols library 114. For example, symbols library 114 may include, but is not limited to, a collection of car symbols, truck symbols, other vehicle symbols (e.g., emergency vehicles, buses, farm equipment, 2-wheel vehicles, etc), landmark symbols (e.g., fire hydrants, trees, fences, poles, cross walks, various barriers, etc), symbols of signs (e.g., standard road signs, any other signs, etc), symbols of people (e.g., pedestrians), symbols of animals, and the like. By use of the elements of drawing toolbar 216, a user may mark up input image 116 in a manner that depicts, for example, the vehicle accident scene. In one example and referring to FIG. 2, a vehicle collision is depicted by a vehicle # 1 and a vehicle # 2 overlaid on input image 116. The symbols for vehicle # 1 and vehicle # 2 are selected from the symbols palette portion of drawing toolbar 216.
  • Optionally, the drawing tool GUI 122 may allow a user to specify a confidence level for a selected symbol. For example, if a user selects a symbol corresponding to a bus to be overlaid on input image 116, the user may specify an associated confidence level to indicate a degree of confidence that the observed vehicle was a bus. The confidence level may be numeric, e.g., “25%,” or descriptive, e.g., “low.” An indication of the confidence level or a degree of uncertainty may be displayed adjacent the corresponding symbol or may be integrated with the symbol itself. For example, a question mark or the confidence level may be displayed on or near the symbol. Additionally or alternatively, an indication of the confidence level may be included in the text of a vehicle accident report including the marked up input image.
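   • As a minimal sketch of one possible data structure for the foregoing (the field names and the label format are assumptions, not part of the disclosure), a placed symbol might simply carry an optional confidence value that is rendered as text adjacent the symbol:

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class PlacedSymbol:
          symbol_id: str                       # e.g., "BUS"
          x_px: int
          y_px: int
          confidence: Optional[float] = None   # 0.0-1.0, or None if not specified

          def label(self) -> str:
              # Text rendered adjacent the symbol; a question mark flags uncertainty.
              if self.confidence is None:
                  return self.symbol_id
              return f"{self.symbol_id} ({self.confidence:.0%}?)"

      print(PlacedSymbol("BUS", 120, 80, confidence=0.25).label())   # BUS (25%?)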
  • The aforementioned palettes, toolbars, and/or symbols library are described in the context of preparing a vehicle accident report. However, this is exemplary only. The palettes, toolbars, and/or symbols library of the geo-referenced electronic drawing application of the present disclosure may be industry-specific and/or incident type-specific. As a result, the palettes, toolbars, and/or symbols library may be selectable by the user depending on the application in which the geo-referenced electronic drawing application is being used. In one example, with respect to an incident involving tree damage and/or a tree damaging a structure, the user may select palettes, toolbars, and/or symbols that include trees and building rooflines that may be used for marking up the geo-referenced image.
  • Additionally, geo-referenced electronic drawing application 100 may be designed to automatically render symbols to scale upon the geo-referenced drawing according to the settings of scale 214. This is one example of how geo-referenced electronic drawing application 100 may provide consistent accuracy to drawings that support incident reports. Further, the presence of a standard symbols library, such as symbols library 114, is one example of how geo-referenced electronic drawing application 100 provides standardization to drawings that support incident reports.
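   • One way such automatic scaling could work is sketched below; the meters-per-pixel representation of scale 214 and the real-world footprint values are assumptions made for illustration only.

      # Hypothetical real-world footprints (length, width) in meters.
      REAL_WORLD_SIZE_M = {
          "CAR": (4.5, 1.8),
          "LIGHT_TRUCK": (5.5, 2.0),
          "BUS": (12.0, 2.5),
      }

      def symbol_size_px(symbol_id, meters_per_pixel):
          # Convert a symbol's real-world footprint to its on-screen size in pixels.
          length_m, width_m = REAL_WORLD_SIZE_M[symbol_id]
          return (round(length_m / meters_per_pixel), round(width_m / meters_per_pixel))

      # At a scale of 0.25 m per pixel, a car renders as roughly 18 x 7 pixels.
      print(symbol_size_px("CAR", 0.25))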
  • The geo-referenced electronic drawing application 100 may be configured to allow the viewing angle or perspective of the input image 116 and/or representations thereon to be changed. For example, a user may switch between an overhead view, a perspective view, and a side view. This may be accomplished by correlating corresponding points in two or more geo-referenced images, for example. A symbol, such as a representation of a vehicle, or other content-related marking added to an image may have three-dimensional data associated therewith to enable the symbol to be viewed from different angles. Thus, while a viewing angle or perspective of an image may change, its content (e.g., a representation of a vehicle accident and its surrounding) may remain the same.
   • Further, the geo-referenced electronic drawing application 100 may be configured to allow the input image 116 to be manually or automatically modified. For example, it may be desirable to remove extraneous features, such as cars, from the input image 116. The geo-referenced electronic drawing application 100 may include shape or object recognition software that allows such features to be identified and/or removed. One example of software capable of recognizing features in an image, such as an aerial image, is ENVI® image processing and analysis software by ITT Corporation of White Plains, N.Y. Exemplary features that may be recognized include vehicles, buildings, roads, bridges, rivers, lakes, and fields. The geo-referenced electronic drawing application 100 may be configured such that a value indicating a level of confidence that an identified object corresponds to a particular feature may optionally be displayed. Automatically identified features may be automatically modified in the image in some manner. For example, the features may be blurred or colored (e.g., white, black or to resemble a color of one or more pixels adjacent the feature). Additionally, or alternatively, the geo-referenced electronic drawing application 100 may include drawing tools (e.g., an eraser tool or copy and paste tool) that allow such features to be removed, concealed, or otherwise modified after being visually recognized by a user or automatically recognized by the geo-referenced electronic drawing application 100 or associated software.
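   • The following is an illustrative sketch only of concealing a recognized feature; the recognizer is represented by a placeholder (it is not the API of ENVI or any other product), and the pixel grid is a toy stand-in for an input image 116.

      def recognize_features(pixels):
          # Placeholder for shape/object recognition software; a real recognizer
          # would return bounding boxes, labels, and confidence values.
          return [(2, 2, 4, 4, "vehicle", 0.9)]

      def conceal(pixels, box):
          # Fill the bounding box with the color of a neighboring pixel.
          x0, y0, x1, y1 = box[:4]
          fill = pixels[max(y0 - 1, 0)][max(x0 - 1, 0)]
          for y in range(y0, y1 + 1):
              for x in range(x0, x1 + 1):
                  pixels[y][x] = fill
          return pixels

      image = [[(200, 200, 200)] * 8 for _ in range(8)]   # toy 8 x 8 gray image
      for feature in recognize_features(image):
          conceal(image, feature)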
  • Drawing toolbar 216 may also allow the user to add text boxes that can be used to add textual content to input image 116. In one example, a callout 218 may be one mechanism for entering and displaying textual information about, in this example, the vehicle collision.
  • Further, drawing tool GUI 122 may include a navigation toolbar 220 by which the user may zoom or pan input image 116 (e.g., zoom in, zoom out, zoom to, pan, pan left, pan right, pan up, pan down, etc.). Navigation toolbar 220 may additionally include one or more buttons that enable user drawn shapes to be accentuated (e.g., grayscale, transparency, etc.). Additionally, a set of scroll controls 222 may be provided in the image display window that allows the user to scroll input image 116 north, south, east, west, and so on with respect to real world directional heading. In addition, the drawing application may be configured to reposition the displayed image so that it is directionally aligned with a direction of the display screen, based on an input from a compass or other device indicative of an orientation of the display screen in the environment.
   • Overlaying input image 116 may also be a timestamp 224 and/or a location stamp 250. Timestamp 224 may indicate the creation date and/or time or a save date and/or time of a marked up input image 116 or information used to generate the marked up input image. Timestamp data 120 in memory 112 of FIG. 1 may be the source of information of timestamp 224. Such data may be based on an output of a local or remote timer, for example. Location stamp 250 may indicate the location where the marked up input image 116 or information used to generate the marked up input image was saved. Location data stored in memory 112 of FIG. 1 may be the source of information for location stamp 250. Such data may be based on an output of a GPS device, for example.
  • The timestamp 224 and location stamp 250 may be automatically generated based, for example, on the output of a timer device and GPS device as discussed above. Further, the timestamp and location stamp may be difficult or impossible for a user to modify. Thus, the timestamp and location stamp may be used to verify that the marked-up input image with which they are associated was created at an expected time and place, such as the general or specific time and place where the vehicular accident or other incident was investigated. If desired, time and/or location data may be automatically acquired several times during the creation of one or more marked-up digital images, and may be stored in association with the images, to enable verification that the user was present at the time and/or place of the investigation for some duration of time.
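   • A minimal sketch of such automatic stamping is shown below, assuming a hypothetical GPS reading and record layout (none of the names used here are defined by the disclosure); a time/location stamp is collected at each save and kept as a read-only record associated with the marked-up image.

      from dataclasses import dataclass, field
      from datetime import datetime, timezone

      def read_gps():
          # Stand-in for a GPS receiver; returns (latitude, longitude).
          return (39.5261, -119.8128)

      @dataclass(frozen=True)   # frozen: stamp fields are not meant to be edited by the user
      class Stamp:
          saved_at_utc: str
          latitude: float
          longitude: float

      @dataclass
      class MarkedUpImage:
          image_id: str
          stamps: list = field(default_factory=list)

          def save(self):
              lat, lon = read_gps()
              self.stamps.append(Stamp(datetime.now(timezone.utc).isoformat(), lat, lon))

      doc = MarkedUpImage("incident-001")
      doc.save()          # stamps may be collected several times during an investigation
      print(doc.stamps)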
  • The ability to read in and electronically mark up real world geo-referenced images, such as input images 116, with symbols, shapes, and/or lines is one example of how geo-referenced electronic drawing application 100 may provide improved and consistent accuracy to drawings that support incident reports.
   • In some embodiments, the input image data and the mark up data (e.g., the electronic representations of the vehicles, landmarks and/or signs) may be displayed as separate “layers” of the visual rendering, such that a viewer of the visual rendering may turn on and turn off displayed data based on a categorization of the displayed data. Respective layers may be enabled or disabled for display in any of a variety of manners. According to one exemplary implementation shown in FIG. 12, a “layer directory” or “layer legend” pane 1200 may be rendered in the viewing window of drawing tool GUI 122 described in connection with FIG. 2. The layer directory pane 1200 may show all available layers, and allow a viewer to select each available layer to be either displayed or hidden, thus facilitating comparative viewing of layers. The layer directory pane 1200 may be displayed by selecting a “display layer directory pane” action item in the layers menu.
  • In the example of FIG. 12, image information is categorized generally under layer designation 1202 (“reference layer”) and may be independently enabled or disabled for display (e.g., hidden) by selecting the corresponding check box. Similarly, information available to be overlaid on the input image is categorized generally under layer designation 1206 (“symbols layer”) and may be independently enabled or disabled for display by selecting the corresponding check box.
  • The reference layer and symbols layers may have sub-categories for sub-layers, such that each sub-layer may also be selectively enabled or disabled for viewing by a viewer. For example, under the general layer designation 1202 of “reference layer,” a “base image” sub-layer may be selected for display. The base image sub-layer is merely one example of a sub-layer that may be included under the “reference layer,” as other sub-layers (e.g., “grid”) are possible. Under the general layer designation 1206 of “symbols layer,” different symbol types that may be overlaid on the input image may be categorized under different sub-layer designations (e.g., designation 1208 for “cars layer;” designation 1212 for “trucks layer;” designation 1216 for “other vehicles layer;” designation 1218 for “landmarks layer;” and designation 1220 for “signs layer”). In this manner, a viewer may be able to display certain symbols information (e.g., concerning cars and trucks), while hiding other symbols information (e.g., concerning other vehicles, landmarks and signs).
  • Further, the various sub-layers may have further sub-categories for sub-layers, such that particular features within a sub-layer may also be selectively enabled or disabled for viewing by a viewer. For example, the cars layer may include a designation 1210 for “car 1,” and the truck layer may include a designation 1214 for “truck 1.” Thus, information concerning the car 1222 (“car 1”) and truck 1224 (“truck 1”) involved in the accident can be selected for display.
  • As shown in the example of FIG. 12, both the reference and symbols layers are enabled for display. Under the reference layer, the base image layer is enabled for display. Amongst the symbols layer sub-layers, only the cars layer and the trucks layer are enabled for display. Amongst these sub-layers, the further sub-layers “car 1” and “truck 1” are enabled for display. Accordingly, a base image is rendered in the viewing window of drawing tool GUI 122, and only car 1222 and truck 1224 are rendered thereon.
  • Virtually any characteristic of the information available for display may serve to categorize the information for purposes of display layers or sub-layers. In particular, any of the various exemplary elements that may be rendered using the drawing tool GUI 122 discussed herein (e.g., timestamps; scales; callouts; estimated time information; input image content; symbols relating to vehicles, landmarks, signs, people, animals or the like, etc.) may be categorized as a sub-layer, and one or more sub-layers may further be categorized into constituent elements for selective display (e.g., as sub-sub-layers).
  • Further, layers may be based on user-defined attributes of symbols or other rendered features. For example, a layer may be based on the speed of vehicles, whether vehicles were involved in the accident, whether the vehicles are public service vehicles, the location of vehicles at a particular time, and so on. For example, a user may define particular vehicle symbols as having corresponding speeds, and a “moving vehicles layer” may be selected to enable the display of vehicles having non-zero speeds. Additionally or alternatively, selecting the moving vehicles layer may cause information concerning the speed of the moving vehicles to be displayed. For example, text indicating a speed of 15 mph may be displayed adjacent a corresponding vehicle. Similarly, a user may define particular vehicle symbols as being involved in the accident, and an “accident vehicles layer” may be selected to enable the display of vehicles involved in the accident. Additionally or alternatively, selecting the accident vehicles layer may cause information identifying accident vehicles to be displayed. For example, an icon indicative of an accident vehicle may be displayed adjacent a corresponding vehicle. The “moving vehicles layer” and the “accident vehicles” layer may be sub-layers under the symbols layer, or may be sub-layers under a “vehicle layer” (not shown), which itself is a sub-layer under the symbols layer. Further, the “moving vehicles layer” and the “accident vehicles layer” may in turn include sub-layers. For example, the “moving vehicles layer” may include a sub-layer to enable the display of all vehicles traveling east. From the foregoing, it may be appreciated that a wide variety of information may be categorized in a nested hierarchy of layers, and information included in the layers may be visually rendered, when selected/enabled for display, in a variety of manners.
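   • The nested, attribute-based layering described above might be organized as in the following sketch; the class names, attribute keys, and the use of a filter predicate are illustrative assumptions rather than features recited in the disclosure.

      from dataclasses import dataclass, field

      @dataclass
      class Feature:
          name: str
          attributes: dict = field(default_factory=dict)

      @dataclass
      class Layer:
          name: str
          enabled: bool = True
          features: list = field(default_factory=list)
          sublayers: list = field(default_factory=list)
          predicate: object = None   # optional attribute-based filter (callable)

          def visible_features(self):
              if not self.enabled:
                  return []
              shown = [f for f in self.features if self.predicate is None or self.predicate(f)]
              for sub in self.sublayers:
                  shown.extend(sub.visible_features())
              return shown

      car1 = Feature("car 1", {"speed_mph": 15, "in_accident": True})
      truck1 = Feature("truck 1", {"speed_mph": 0, "in_accident": True})
      symbols_layer = Layer("symbols layer", sublayers=[
          Layer("cars layer", enabled=False, features=[car1]),
          Layer("trucks layer", enabled=False, features=[truck1]),
          Layer("moving vehicles layer", features=[car1, truck1],
                predicate=lambda f: f.attributes.get("speed_mph", 0) > 0),
      ])
      print([f.name for f in symbols_layer.visible_features()])   # ['car 1']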
  • Other attributes of symbols or other rendered features may also be used as the basis for defining layers. For example, the user-determined and/or automatically determined confidence levels of respective symbols, as discussed herein, may be used as the basis for defining layers. According to one illustrative example, a layer may be defined to include only those symbols that have an associated user-determined and/or automatically determined confidence level of at least some percentage, e.g., 50%. The information concerning the confidence levels associated with the symbols may be drawn from a report in which such levels are included.
  • It should further be appreciated that, according to various embodiments, the attributes and/or type of visual information displayed as a result of selecting one or more layers or sub-layers is not limited. In particular, visual information corresponding to a selected layer or sub-layer may be electronically rendered in the form of one or more lines or shapes (of various colors, shadings and/or line types), text, graphics (e.g., symbols or icons), and/or images, for example. Likewise, the visual information corresponding to a selected layer or sub-layer may include multiple forms of visual information (one or more of lines, shapes, text, graphics and/or images).
  • In yet other embodiments, all of the symbols and/or other overlaid information of a particular marked up input image may be categorized as a display layer, such that the overlaid information may be selectively enabled or disabled for display as a display layer. In this manner, a user may conveniently toggle between the display of various related marked up input images (e.g., marked up input images relating to the same accident or other event) for comparative display. In particular, a user may toggle between scenes depicting the events of an accident at different times.
  • It should be appreciated that a layer need not include a singular category of symbols or overlaid information, and may be customized according to a user's preferences. For example, a user may select particular features in one or more marked up input images that the user would like to enable to be displayed collectively as a layer. Additionally or alternatively, the user may select a plurality of categories of features that the user would like to enable to be displayed collectively as a layer.
  • In some embodiments, processing unit 110 (FIG. 1) may automatically select which layers are displayed or hidden. As an example, if a user depicts a truck in the accident scene using a truck symbol, processing unit 110 may automatically select the “truck layer” sub-layer and the “truck 1” sub-sub layer for display in the display field. As another example, if a user specifies or selects landmarks to be displayed, processing unit 110 may automatically select the base image to be hidden to provide an uncluttered depiction of an accident scene. The foregoing are merely illustrative examples of automatic selection/enabling of layers, and the inventive concepts discussed herein are not limited in these respects.
  • Referring to FIGS. 1 and 2, when the user has completed marking up (e.g., with lines, shapes, symbols, text, etc.) the certain input image 116, the marked up input image 116 may be saved as an event-specific image 126. For example, during the save operation of geo-referenced electronic drawing application 100, any event-specific images 126 created therein may be converted to any standard digital image file format, such as PDF, JPG, and BMP file format, and saved, for example, in memory 112 or to an associated file system (not shown). In some cases, it may be beneficial for the user to generate multiple event-specific images 126 in order to depict, for example, more details of how a vehicle accident occurred. The multiple event-specific images 126 may be associated to one another via, for example, respective descriptor files 128 and saved as an image series 130. An example of an image series 130 is shown with reference to FIG. 3.
   • Each descriptor file 128 includes information about each event-specific image 126 of an image series 130. Using the example of a vehicle accident report, each descriptor file 128 may include the accident report number, the name of the event-specific image 126 with respect to the image series 130, the creation date, and the like. Descriptor files 128 provide a mechanism of geo-referenced electronic drawing application 100 that allows event-specific images 126 and/or any image series 130 to be queried by other applications, such as any incident management applications. In one example, descriptor files 128 may be extensible markup language (XML) files that are created during the save process of event-specific images 126 and/or image series 130.
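   • As one hypothetical illustration of such a descriptor file (the element names and values below are assumptions; the disclosure does not define a schema), an XML descriptor might be written at save time as follows:

      import xml.etree.ElementTree as ET

      def write_descriptor(path, report_number, image_name, series_name, frame_index, created):
          # Build a small XML document describing one event-specific image.
          root = ET.Element("descriptor")
          ET.SubElement(root, "accidentReportNumber").text = report_number
          ET.SubElement(root, "imageName").text = image_name
          ET.SubElement(root, "imageSeries", name=series_name, frame=str(frame_index))
          ET.SubElement(root, "creationDate").text = created
          ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

      write_descriptor("frame1.xml", "RPT-2010-0042", "frame1.jpg", "collision-series", 1, "2010-04-02")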
  • Referring to FIG. 3, an example of a series of geo-referenced drawings that are generated using geo-referenced electronic drawing application 100 is presented. FIG. 3 shows an example of an image series 130 that depicts time-lapsed sequential images of a vehicle collision (i.e., essentially representing time-lapsed frames 1, 2, and 3 in sequence). In this example, frame 1 is represented by an event-specific image 126A that depicts vehicle # 1 heading westbound and vehicle # 2 heading eastbound, just prior to the collision. Frame 2 is represented by an event-specific image 126B that depicts vehicle # 1 and vehicle # 2 at the moment of impact during the collision. Frame 3 is represented by an event-specific image 126C that depicts the final resting place of vehicle # 1 and vehicle # 2 after the collision.
   • Each of the event-specific images 126A-C may include a corresponding estimated relative time 225A-C represented thereon. The estimated relative time may reflect an estimated time of the event (e.g., a vehicle accident) depicted in the event-specific image. In the example of FIG. 3, an estimated relative time is rendered visually (e.g., overlaid) on the input image 116 of each of the event-specific images 126A-C. In event-specific image 126A, the estimated relative time 225A is represented by a variable “t,” which corresponds to an unknown time. In event-specific images 126B and 126C, the estimated relative times 225B and 225C are represented by times relative to the variable “t” (i.e., “t+0.5 sec” and “t+1 sec,” respectively). Additionally or alternatively, the estimated relative time may reflect an estimated date of the vehicle accident. As also shown in FIG. 3, one or more of event-specific images 126A-C may include a corresponding estimated actual time 227 represented thereon. The estimated actual time may reflect an estimated non-relative time of the vehicle accident. The estimated relative time 225A-C and the estimated actual time 227 may be estimated by the user of the drawing application or a related party.
  • In some embodiments, it may be desirable to generate an animated sequence based on a plurality of event-specific images 126. According to one exemplary implementation shown in FIG. 13, an animation controls window 1302 may be rendered in the viewing window of drawing tool GUI 122 described in connection with FIG. 2 to facilitate generation of an animated sequence. The animation controls window 1302 may be displayed by selecting a “display animation controls” action item in the animation menu 1300.
  • The animation controls window 1302 comprises an interface 1304 for specifying frame order, an interface 1306 for specifying animation speed, and an interface 1308 for specifying a transition between frames. In the example of FIG. 13, interface 1304 lists each of the frames representing event-specific images. A user may specify a sequential order for the listed frames by selecting up or down arrows associated with the listed frames. For example, a user may select the down arrow associated with “Frame 1” to move this frame to a later sequential order.
   • Interface 1306 lists options for specifying the animation speed of the frames. A first option, which is selected in the example of FIG. 13, provides that the animation speed of the frames will be based on an estimated time for each frame. In particular, by selecting this option, the animation speed may be based on an estimated relative time or an estimated actual time that may be specified for each frame as discussed in connection with FIG. 3. For example, if Frame 2 has an estimated relative time that is two seconds after that of Frame 1, and Frame 3 has an estimated relative time that is five seconds after that of Frame 2, the frames may be displayed at zero seconds, two seconds, and seven seconds, respectively, or at some multiplier thereof. For example, the frames may be displayed at half of the estimated actual speed by displaying the frames at zero seconds, four seconds, and fourteen seconds, respectively. A second option, which is unselected in the example of FIG. 13, provides that the animation speed of the frames will be based on a regular interval, the length of which may be adjusted by sliding the arrow associated with the interval length control feature to the left or the right.
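   • A minimal sketch of the timing computation described above (the function and variable names are illustrative only):

      def display_times(estimated_times_sec, multiplier=1.0):
          # Offsets each frame's estimated time from the first frame; a multiplier
          # of 2.0 plays the sequence at half of the estimated actual speed.
          t0 = estimated_times_sec[0]
          return [(t - t0) * multiplier for t in estimated_times_sec]

      estimates = [0.0, 2.0, 7.0]             # Frame 1, Frame 2 (+2 s), Frame 3 (+5 s more)
      print(display_times(estimates))         # [0.0, 2.0, 7.0]
      print(display_times(estimates, 2.0))    # [0.0, 4.0, 14.0]  (half speed)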
  • It should be appreciated that the animation speed need not be consistent for all frames, and that the animation speed for particular sequences of frames may be adjusted as desired by the user. For example, the time associated with one or more frames may be increased or decreased relative to an estimated time so that the user can observe how such an increase or decrease impacts the animation and/or simulate different scenarios.
   • Interface 1308 lists options for specifying a transition between overlaid features in the frames (e.g., vehicle symbols). A first option, which is selected in the example of FIG. 13, provides that there is no transition between the overlaid features. In this case, the frames will simply be displayed sequentially as a time-lapse animation. A second option, which is unselected in the example of FIG. 13, provides that the overlaid features (e.g., vehicle symbols) will trace a path between their position in consecutive frames to transition between the event-specific images of consecutive frames. In the case of a vehicle, for example, the path may be a linear path between a first point representing a center of the vehicle symbol in one image and a second point representing a center of the vehicle symbol in a consecutive image. According to one exemplary implementation, all overlaid features or all overlaid features of a particular type (e.g., vehicle symbols) will, by default, be animated in this manner. According to another exemplary implementation, if the second option is selected, an additional interface may be displayed that allows a user to select which overlaid features are to be animated and which are to remain stationary. For example, a user may specify that only features belonging to a certain custom or non-custom layer be animated while all other features remain stationary. Conversely, a user may specify that only features belonging to a certain custom or non-custom layer shall remain stationary while all other features are animated. This additional interface may or may not be used in connection with default settings.
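   • The path-tracing transition described above amounts to interpolating a symbol's center point between its positions in consecutive frames; the following sketch (with illustrative names and a simple linear path) conveys the idea.

      def interpolate(p0, p1, fraction):
          # Point on the straight line between p0 and p1, for 0 <= fraction <= 1.
          return (p0[0] + (p1[0] - p0[0]) * fraction,
                  p0[1] + (p1[1] - p0[1]) * fraction)

      def tween(positions_by_frame, steps):
          # positions_by_frame: dict mapping frame number -> (x, y) center point.
          frames = sorted(positions_by_frame)
          for a, b in zip(frames, frames[1:]):
              for i in range(steps):
                  yield interpolate(positions_by_frame[a], positions_by_frame[b], i / steps)
          yield positions_by_frame[frames[-1]]

      vehicle_1 = {1: (100, 240), 2: (180, 240), 3: (205, 255)}   # center pixel per frame
      print(list(tween(vehicle_1, steps=4)))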
  • Referring to FIGS. 1, 2, and 3, geo-referenced electronic drawing application 100 provides a mechanism by which event-specific images 126 and/or any image series 130 or animation based thereon may be integrated into electronic reports, such as reports 132 of FIG. 1. Reports 132 may be any electronic reports in which geo-referenced electronic drawings may be useful, such as electronic personal injury reports, electronic vehicle accident reports, any types of electronic property damage reports, and the like. An example of a report 132 is shown with reference to FIG. 4.
  • Referring to FIG. 4, a traffic collision report 400 that is generated using geo-referenced electronic drawing application 100 and that includes a geo-referenced drawing is presented. Traffic collision report 400 is an example of a report 132. Traffic collision report 400 may be, for example, a report used by accident investigation companies, law enforcement agencies, and/or insurance companies.
  • In this example, a certain event-specific image 126 is read into a drawing field of traffic collision report 400. In this way, the certain event-specific image 126 is integrated into traffic collision report 400. The textual information of traffic collision report 400 may be manually entered and/or automatically imported from information associated with event-specific image 126, which was captured using drawing tool GUI 122. For example, a “Description of Accident” field may be populated with textual information of callout 218 (see FIG. 2). Additionally, an entry screen (not shown) of geo-referenced electronic drawing application 100 may be provided that allows the user to manually enter and/or modify information in the text fields of a report 132, such as the text fields of traffic collision report 400. The entry screen may be incorporated in and/or operate in combination with drawing tool GUI 122.
   • A report 132, such as traffic collision report 400, is not limited to incorporating a single event-specific image 126. For example, subsequent pages of traffic collision report 400 may include all event-specific images 126 of a certain image series 130, such as those shown in FIG. 3.
  • Referring to FIG. 5, a flow diagram of an example of a method 500 of operation of geo-referenced electronic drawing application 100 is presented. Method 500 may include, but is not limited to, the following steps, which are not limited to any order.
  • At step 510, by use of drawing tool GUI 122, processing unit 110 of geo-referenced electronic drawing application 100 acquires location information with respect to the event of interest. For example, geographic location information from geo-location data 118 may be automatically read into address field 210 and/or geo-location data field 212 of drawing tool GUI 122. Alternatively, location information may be manually entered in address field 210 and/or geo-location data field 212.
  • At step 512, the collection of geo-referenced images is queried, the matching geo-referenced image is read into drawing tool GUI 122, and the geo-referenced image is rendered in the viewing window of drawing tool GUI 122. For example, processing unit 110 of geo-referenced electronic drawing application 100 queries input images 116, which are the geo-referenced images, in order to find the input image 116 that matches the location information of step 510. Once the matching input image 116 is found, the input image 116 is read into drawing tool GUI 122 and rendered in the viewing window thereof. In this way, a geo-referenced image is provided to the user, upon which markings that indicate the event of interest may be made. In one example and referring to FIG. 2, an input image 116 that matches “263 Main St, Reno, Nev.” in address field 210 is located in the store of input images 116 in memory 112 and then read into drawing tool GUI 122.
   • At step 514, processing unit 110 of geo-referenced electronic drawing application 100 may process any symbols that are selected from symbols library 114 along with any other markings that are overlaid upon the geo-referenced image to depict the event of interest. For example, any symbols that are selected using drawing toolbar 216 of drawing tool GUI 122 may be overlaid upon the certain input image 116 in order to depict the event of interest, such as a vehicle accident. In one example and referring to FIG. 2, a symbol for a car (vehicle #1) and a symbol for a light truck (vehicle #2) are positioned and rendered upon the input image 116 that matches “263 Main St, Reno, Nev.” in address field 210. Additionally, geo-referenced electronic drawing application 100 is designed to automatically render symbols to scale upon the certain input image 116 according to the settings of scale 214.
  • Further, other markings (e.g., a polygon, a rectangle, a circle, a line) may be overlaid upon input image 116. In one example, using the sketching palette portion of drawing toolbar 216, lines to indicate skid marks may be drawn upon input image 116.
  • At step 516, processing unit 110 of geo-referenced electronic drawing application 100 may process any textual information related to the geo-referenced image. In one example and referring to FIG. 2, callout 218 may be used for entering and displaying textual information about the vehicle collision. Callout 218 is shown overlaid upon input image 116.
  • At step 518, processing unit 110 of geo-referenced electronic drawing application 100 may render and save the event-specific image along with its associated descriptor file. In one example when the user has completed marking up (e.g., with lines, shapes, symbols, text, etc.) the certain input image 116, the marked up input image 116 may be saved as an event-specific image 126. For example, during the save operation of geo-referenced electronic drawing application 100, any event-specific images 126 created therein may be converted to any standard digital image file format, such as PDF, JPG, and BMP file format, and saved. Further, its associated descriptor file 128 is created and saved.
  • At decision step 520, the user of geo-referenced electronic drawing application 100 determines whether an image series, such as the example image series 130 of FIG. 3, is required in order to adequately depict the event of interest. If yes, method 500 proceeds to step 522. If no, method 500 proceeds to step 526.
  • At decision step 522, the user of geo-referenced electronic drawing application 100 determines whether the image series is complete. If yes, method 500 proceeds to step 524. If no, method 500 returns to step 510 to begin creating the next event-specific image.
  • At step 524, the descriptor files 128 of the event-specific images 126 that are included in the image series 130 are associated and the image series 130 is saved.
  • At step 526, the event-specific image 126 and/or all event-specific images 126 of the image series 130 and any other information are integrated into the electronic report of interest. In one example, a certain event-specific image 126 is integrated into a certain type of report 132, such as traffic collision report 400 of FIG. 4. Further, textual information associated with event-specific image 126 may be automatically imported into traffic collision report 400.
  • Referring to FIG. 6, a functional block diagram of a networked system 600 that includes geo-referenced electronic drawing application 100 for documenting and reporting events is presented. In this embodiment, geo-referenced electronic drawing application 100 may be a web-based application. Therefore, networked system 600 may include an application server 610 upon which geo-referenced electronic drawing application 100 is installed.
  • Application server 610 may be any application server, such as a web application server and/or web portal, by which one or more user 612 may access geo-referenced electronic drawing application 100 with respect to documenting and reporting events. Application server 610 may be accessed by users 612 via any networked computing device, such as his/her local computing device 140. In one example, users 612 may be any personnel associated with accident investigation companies, law enforcement agencies, and/or insurance companies.
  • Networked system 600 of the present disclosure may further include an image server 614, which is one example of an entity supplying input images 116 of FIG. 1. Image server 614 may be any computer device for storing and providing input images 116, such as aerial images of geographic locations.
  • Networked system 600 of the present disclosure may further include a central server 616. In one example, central server 616 may be associated with accident investigation companies, law enforcement agencies, and/or insurance companies. Certain business applications, such as management applications 618, may reside on central server 616. Management applications 618 may be, for example, any incident management applications.
  • A network 620 provides the communication link between any and/or all entities of networked system 600. For example, network 620 provides the communication network by which information may be exchanged between application server 610, image server 614, central server 616, and computing devices 140. Network 620 may be, for example, any local area network (LAN) and/or wide area network (WAN) for connecting to the Internet.
  • In order to connect to network 620, each entity of networked system 600 includes a communication interface (not shown). For example, the respective communication interfaces of application server 610, image server 614, central server 616, and computing devices 140 may be any wired and/or wireless communication interface by which information may be exchanged between any entities of networked system 600. Examples of wired communication interfaces may include, but are not limited to, USB ports, RS232 connectors, RJ45 connectors, Ethernet, and any combinations thereof. Examples of wireless communication interfaces may include, but are not limited to, an Intranet connection, Internet, Bluetooth® technology, Wi-Fi, Wi-Max, IEEE 802.11 technology, radio frequency (RF), Infrared Data Association (IrDA) compatible protocols, Local Area Networks (LAN), Wide Area Networks (WAN), Shared Wireless Access Protocol (SWAP), any combinations thereof, and other types of wireless networking protocols.
  • In certain embodiments, geo-referenced electronic drawing application 100 may include a feature for attaching media files to reports 132. For example, networked system 600 may include certain media capture devices 622 for capturing media files 624. Media capture devices 622 may be any media capture devices, such as digital cameras, digital audio recorders, digital video recorders, and the like. Therefore, media files 624 may be, for example, digital image files, digital audio files, digital video files, and the like. The media files 624 may likewise have descriptor files (not shown) associated therewith for, for example, associating to certain reports 132. In one example, the media files 624 may be provided as attachments to reports 132. According to other embodiments, computing device 140 may include one or more media capture devices as described above.
   • The attached media files 624 may be stamped with time, location and/or direction information. For example, a media file 624 may include a timestamp identifying a calendar date and/or time that the media file was created and/or a calendar date and/or time that the media file was stored in memory by the computing device 140. Similarly, the media file may include a location stamp identifying a location (e.g., a city and state or geographic coordinates) where the media file was created and/or a location where the media file was stored in memory by the computing device 140. A media file may also include a direction stamp specifying directional information associated therewith. For example, if the media file is a photographic image or video that was taken with a camera device associated with a compass, the photographic image or video may be stamped with directional information based on an output of the compass to indicate that the image or video was taken while the camera lens was facing northwest. In certain embodiments, the media files 624 may be automatically stamped with time, location and/or direction information. The timestamp and location stamp, particularly when automatically generated, may be used as verification that the media file was stored at a particular time and place, such as the time and place where the report associated with the media file was created. The direction stamp may be used as verification that the media file was created while a media capture device was facing in a particular direction or otherwise had a particular orientation. The location, time and/or direction data used for the location stamp, timestamp and/or direction stamp may originate from the computing device on which the geo-referenced electronic drawing application is installed, or from any other computing device. For example, the computing device may be GPS-enabled and may include a timer and a compass. Alternatively, the location, time and/or direction data may be based on manual data entry by the user. It should be appreciated that the media file need not be modified to include the location, time and/or direction data described above, as the data may alternatively be stored in association with the media file as distinct data.
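   • The following sketch illustrates one way time, location, and direction data might be stored in association with a media file as a distinct sidecar record; the sensor readings are stand-in values and the field names are assumptions, not part of the disclosure.

      import json
      from datetime import datetime, timezone

      def read_compass_heading():
          return 315.0                      # degrees; 315 corresponds to northwest (stand-in)

      def read_gps():
          return (39.5261, -119.8128)       # stand-in GPS reading

      def stamp_media_file(media_path):
          lat, lon = read_gps()
          record = {
              "media_file": media_path,
              "stored_at_utc": datetime.now(timezone.utc).isoformat(),
              "latitude": lat,
              "longitude": lon,
              "heading_deg": read_compass_heading(),
          }
          # Store the stamp as a separate file rather than modifying the media file itself.
          with open(media_path + ".stamp.json", "w") as fh:
              json.dump(record, fh, indent=2)
          return record

      print(stamp_media_file("scene_photo_01.jpg"))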
   • As discussed herein, the computing device 140 shown in FIG. 6 may have a communication interface that may receive information from network 620, which may be a LAN and/or WAN for connecting to the Internet. According to one embodiment, information about an environmental condition may be received as a media file via the communication interface. For example, weather information (e.g., temperature, visibility and precipitation information), traffic information and/or construction information may be received from the Internet via the communication interface. Such information may be received from a weather service, traffic service, traffic records, construction service or the like. Received information may be attached as files to reports 132. Alternatively, or in addition, received information may be incorporated within the reports 132 themselves. For example, if the received information indicates that the weather at the time of an accident was sunny, such information may be automatically input to the traffic collision report 400 discussed in connection with FIG. 4. In particular, the report could include this information as text in a data field, or an event-specific image 126 in the report could include an image of a sun or another icon indicating sunny weather. As another example, if the received information indicates that the visibility at the time of the accident was 20 feet, the report could include this information as text in a data field and/or represent this information in an event-specific image 126. For example, to represent the area that could not be viewed by a particular driver, the area beyond a 20 foot radius of the driver in the event-specific image 126 could be colored gray, blacked out, or designated with hash marks. Alternatively, the traffic collision report 400 could be manually updated to include weather information, traffic information, construction information, or the like. Condition information received via the communication interface may be stored with and/or stamped with location, time and/or direction data indicating when the condition information was stored by the computing device 140.
   • In certain embodiments, central server 616 of networked system 600 may include a collection of historical reports 626, which are records of reports 132 that have been processed in the past. In one example, in the context of vehicle accident reports, historical reports 626 may be useful to inform current reports 132, such as current accident reports that are being processed. For example, being able to review historical information pertaining to a certain intersection may be useful to add to an accident report for fault analysis purposes, as certain trends may become apparent. For example, historical reports 626 may indicate for a certain highway or street intersection that a steep hill is present, the traffic light malfunctions, the line of sight to the stop sign is obstructed, there is a poor angle of visibility at the intersection, the intersection is an accident prone area in poor weather conditions (e.g., a bridge approaching the intersection freezes over), and the like. Referring again to step 526 of method 500 of FIG. 5, information from historical reports 626 may be other information that may be integrated into reports 132.
   • In operation, each user of networked system 600 may access geo-referenced electronic drawing application 100 via his/her local computing device 140. Networked system 600 may provide a secure login function, which allows users 612 to access the functions of geo-referenced electronic drawing application 100. Once authorized, users 612 may open drawing tool GUI 122 using, for example, the web browsers of their computing devices 140. Geographic location information is read into or manually entered into drawing tool GUI 122 and event-specific images 126, image series 130, and/or reports 132 may be generated as described with reference to FIGS. 1 through 5. In this process, input images 116 of image server 614 may be the source of the geo-referenced images that are read into geo-referenced electronic drawing application 100. Subsequently, reports 132 that include geo-referenced images, such as event-specific images 126, and, optionally, one or more media files 624 attached thereto may be transmitted in electronic form from the computing devices 140 of users 612 to any entities connected to network 620 of networked system 600. In one example, reports 132 that include geo-referenced images may be transmitted in electronic form from the computing devices 140 of users 612 to central server 616 for further review and processing only by authorized users of networked system 600. This is an example of how geo-referenced electronic drawing application 100 is used in networked system 600 to provide improved distribution, tracking, and auditing of reports among entities and to provide improved control over access to reports.
  • Referring again to FIG. 6, networked system 600 is not limited to the types and numbers of entities that are shown in FIG. 6. Any types and numbers of entities that may be useful in event documenting and reporting systems may be included in networked system 600. Further, in another embodiment, geo-referenced electronic drawing application 100 may be a standalone application that resides on each networked computing device 140. Therefore, in this embodiment, networked system 600 of FIG. 6 need not include application server 610.
  • In summary and referring to FIGS. 1 through 6, geo-referenced electronic drawing application 100 of the present disclosure provides the ability to electronically mark up real world geo-referenced images, such as input images 116, with symbols, shapes, and/or lines in order to provide improved accuracy and consistent accuracy with respect to drawings that support incident reports.
  • Further, geo-referenced electronic drawing application 100 of the present disclosure provides the ability to electronically mark up real world geo-referenced images with symbols, shapes, and/or lines to scale, again providing improved accuracy and consistent accuracy with respect to drawings that support incident reports.
  • Further, geo-referenced electronic drawing application 100 of the present disclosure provides a standard symbols library, such as symbols library 114, thereby providing standardization with respect to drawings that support incident reports.
  • Further, networked systems that include geo-referenced electronic drawing application 100 of the present disclosure, such as networked system 600, provide improved distribution, tracking, and auditing of reports among entities and provide improved control over access to reports.
  • CONCLUSION
  • While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
  • The above-described embodiments can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
  • Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
   • Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • FIG. 14 shows an illustrative computer 1400 that may be used at least in part to implement the geo-referenced electronic drawing application 100 described herein in accordance with some embodiments. For example, the computer 1400 comprises a memory 1410, one or more processing units 1412 (also referred to herein simply as “processors”), one or more communication interfaces 1414, one or more display units 1416, and one or more user input devices 1418. The memory 1410 may comprise any computer-readable media, and may store computer instructions (also referred to herein as “processor-executable instructions”) for implementing the various functionalities described herein. The processing unit(s) 1412 may be used to execute the instructions. The communication interface(s) 1414 may be coupled to a wired or wireless network, bus, or other communication means and may therefore allow the computer 1400 to transmit communications to and/or receive communications from other devices. The display unit(s) 1416 may be provided, for example, to allow a user to view various information in connection with execution of the instructions. The user input device(s) 1418 may be provided, for example, to allow the user to make manual adjustments, make selections, enter data or various other information, and/or interact in any of a variety of manners with the processor during execution of the instructions.
  • The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
  • In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.
  • The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationships between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags, or other mechanisms that establish relationships between data elements.
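  • As a concrete but purely hypothetical illustration of the two mechanisms just described, the fragment below (in Python, with made-up field names) first relates a representation and its estimated time by co-locating them in a single record, and then relates the same information across separately stored records by means of a shared identifier acting as a pointer or tag.

    # Relationship conveyed by location: the fields reside in the same record.
    markup_by_location = {"representation": "vehicle symbol", "estimated_time": "14:32"}

    # Relationship conveyed by a pointer/tag: the fields reside in separate
    # records that are linked through the shared key "rep-001".
    representations = {"rep-001": {"representation": "vehicle symbol"}}
    estimated_times = {"rep-001": "14:32"}

    assert estimated_times["rep-001"] == "14:32"  # the shared key resolves the relationship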
  • Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
  • All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
  • The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
  • The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising,” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
  • As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
  • In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims (29)

1. An apparatus for documenting an incident at an incident site, the apparatus comprising:
a communication interface;
a display device;
at least one user input device;
a memory to store processor-executable instructions; and
a processing unit communicatively coupled to the communication interface, the display device, the at least one user input device, and the memory, wherein upon execution of the processor-executable instructions by the processing unit, the processing unit:
controls the communication interface to electronically receive source data representing at least one input image of a geographic area including the incident site;
controls the display device to display at least a portion of the at least one input image;
acquires first user input from the at least one user input device to provide a first representation of at least a portion of the incident at a first time on the at least one input image;
generates a first marked-up digital image including the first representation based on the first user input;
acquires second user input from the at least one user input device to provide a second representation of at least a portion of the incident at a second time on the at least one input image;
generates a second marked-up digital image including the second representation based on the second user input; and
further controls the communication interface and/or the memory to electronically transmit and/or electronically store information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
2. The apparatus of claim 1, wherein the processing unit:
acquires third user input from the at least one user input device, the third user input indicating an estimate of the first time; and
further controls the communication interface and/or the memory to electronically transmit and/or electronically store the estimate of the first time in association with the information relating to the first marked-up digital image so as to document an estimated time corresponding to the first representation.
3. The apparatus of claim 2, wherein the processing unit:
acquires fourth user input from the at least one user input device, the fourth user input indicating an estimate of the second time; and
further controls the communication interface and/or the memory to electronically transmit and/or electronically store the estimate of the second time in association with the information relating to the second marked-up digital image so as to document an estimated time corresponding to the second representation.
4. The apparatus of claim 1, wherein the processing unit:
acquires third user input from the at least one user input device, the third user input indicating an estimate of the first time; and
modifies the first marked-up digital image to include the estimate of the first time.
5. The apparatus of claim 2, wherein the processing unit:
acquires fourth user input from the at least one user input device, the fourth user input indicating an estimate of the second time; and
modifies the second marked-up digital image to include the estimate of the second time.
6. The apparatus of claim 1, wherein the processing unit further controls the display device to display a series of images as an animated sequence, the series of images comprising the first and second marked-up digital images.
7. The apparatus of claim 6, wherein the processing unit:
acquires third user input from the at least one user input device, the third user input indicating an estimate of the first time;
acquires fourth user input from the at least one user input device, the fourth user input indicating an estimate of the second time; and
controls the display device to display the first and second marked-up digital images at relative times that are based at least in part on the estimates of the first and second times.
8. The apparatus of claim 1, wherein the at least one input image is geo-referenced.
9. The apparatus of claim 8, wherein the processing unit:
scales at least a portion of the first representation and/or second representation based on a scale of the at least one geo-referenced input image.
10. The apparatus of claim 9, wherein the at least a portion of the first representation and/or second representation comprises a symbol selected from a symbol library.
11. The apparatus of claim 8, wherein the processing unit:
acquires geographic location information corresponding to the incident site from a global positioning system; and
acquires, based on the geographic location information, the source data representing the at least one geo-referenced input image of the geographic area including the incident site.
12. The apparatus of claim 8, wherein the at least one geo-referenced input image comprises a first geo-referenced input image, and wherein the processing unit:
generates, using geo-reference data associated with the first geo-referenced input image, a second geo-referenced input image having a different perspective than the first geo-referenced input image.
13. The apparatus of claim 1, wherein:
the incident involves a vehicle; and
the first representation comprises a representation of the vehicle.
14. The apparatus of claim 13, wherein the processing unit:
scales the representation of the vehicle based on a scale of the at least one input image.
15. The apparatus of claim 13, wherein the processing unit:
selects a vehicle symbol corresponding to the vehicle from a plurality of vehicle symbols in a symbol library; and
wherein the first representation comprises the selected vehicle symbol.
16. The apparatus of claim 13, wherein the processing unit:
selects the vehicle symbol based on a vehicle identification number of the vehicle.
17. The apparatus of claim 16, wherein the incident involves a vehicular incident, and wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store a vehicular incident report including the first and second marked-up digital images.
18. The apparatus of claim 1, wherein the incident involves a personal injury, and wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store a personal injury report including the first and second marked-up digital images.
19. The apparatus of claim 1, wherein the incident involves property damage, and wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store a property damage report including the first and second marked-up digital images.
20. The apparatus of claim 1, wherein the incident involves police-investigated activity, and wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store a police report including the first and second marked-up digital images.
21. The apparatus of claim 1, wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store an incident report including the first and second marked-up digital images.
22. The apparatus of claim 21, wherein the processing unit:
controls the communication interface and/or the memory to electronically transmit and/or electronically store a descriptor file comprising:
information identifying the incident report; and
information identifying the first and second marked-up digital images.
23. The apparatus of claim 1, wherein the processing unit:
controls the display device to display a symbol palette, the symbol palette comprising a selection of symbols for depicting objects and/or events.
24. The apparatus of claim 23, wherein the selection of symbols comprises at least one landmark symbol.
25. The apparatus of claim 23, wherein the selection of symbols comprises at least one vehicle symbol.
26. The apparatus of claim 23, wherein the selection of symbols comprises at least one person symbol.
27. The apparatus of claim 1, wherein the processing unit:
controls the display device to display a sketching palette, the sketching palette comprising a selection of renderable shapes.
28. A method for documenting an incident at an incident site, the method comprising:
A) receiving source data representing at least one input image of a geographic area including the incident site;
B) processing the source data so as to display at least a portion of the at least one input image on a display device;
C) receiving first user input via at least one user input device associated with the display device;
D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image;
E) receiving second user input via the at least one user input device;
F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and
G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
29. At least one computer-readable medium encoded with instructions that, when executed on at least one processing unit, perform a method for documenting an incident at an incident site, the method comprising:
A) receiving source data representing at least one input image of a geographic area including the incident site;
B) processing the source data so as to display at least a portion of the at least one input image on a display device;
C) receiving first user input via at least one user input device associated with the display device;
D) processing the first user input so as to display, on the display device, a first marked-up digital image including a first representation of at least a portion of the incident at a first time on the at least one input image;
E) receiving second user input via the at least one user input device;
F) processing the second user input so as to display, on the display device, a second marked-up digital image including a second representation of at least a portion of the incident at a second time on the at least one input image; and
G) electronically transmitting and/or electronically storing information relating to the first and second marked-up digital images so as to document the incident at different times with respect to the geographic area.
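The following sketch is provided solely as a hypothetical illustration, not as a disclosed implementation: it walks through steps A) through G) of the method recited in claims 28 and 29 in Python, where all function and variable names (decode_input_image, render_markup, record_store, and so on) are assumptions introduced here, and plain lists stand in for the display device, the user input device, and the transmission/storage destination.

    def decode_input_image(source_data):
        # A) receive source data representing an input image of the geographic area
        return {"image": source_data}

    def render_markup(image, user_input, time_label):
        # D)/F) generate a marked-up digital image including the user's representation
        return {"base": image, "overlay": user_input, "time": time_label}

    def document_incident(source_data, display, user_input_device, record_store):
        image = decode_input_image(source_data)
        display.append(image)                                     # B) display the input image
        first_input = user_input_device.pop(0)                    # C) receive first user input
        first_markup = render_markup(image, first_input, "t1")    # D) first marked-up image
        display.append(first_markup)
        second_input = user_input_device.pop(0)                   # E) receive second user input
        second_markup = render_markup(image, second_input, "t2")  # F) second marked-up image
        display.append(second_markup)
        record_store.extend([first_markup, second_markup])        # G) store/transmit the documentation
        return record_store

    # Example usage with plain lists standing in for the devices and storage.
    report = document_incident("aerial image of incident site",
                               display=[],
                               user_input_device=["skid marks at first time",
                                                  "vehicle at rest at second time"],
                               record_store=[])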
US12/753,687 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings Abandoned US20100256981A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/753,687 US20100256981A1 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16638509P 2009-04-03 2009-04-03
US16639209P 2009-04-03 2009-04-03
US12/753,687 US20100256981A1 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings

Publications (1)

Publication Number Publication Date
US20100256981A1 true US20100256981A1 (en) 2010-10-07

Family

ID=42826905

Family Applications (5)

Application Number Title Priority Date Filing Date
US12/753,664 Abandoned US20100257477A1 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via geo-referenced electronic drawings
US12/753,687 Abandoned US20100256981A1 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings
US12/753,699 Active 2031-03-09 US8260489B2 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US13/568,932 Expired - Fee Related US8612090B2 (en) 2009-04-03 2012-08-07 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US14/107,550 Abandoned US20140347396A1 (en) 2009-04-03 2013-12-16 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/753,664 Abandoned US20100257477A1 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for documenting and reporting events via geo-referenced electronic drawings

Family Applications After (3)

Application Number Title Priority Date Filing Date
US12/753,699 Active 2031-03-09 US8260489B2 (en) 2009-04-03 2010-04-02 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US13/568,932 Expired - Fee Related US8612090B2 (en) 2009-04-03 2012-08-07 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US14/107,550 Abandoned US20140347396A1 (en) 2009-04-03 2013-12-16 Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations

Country Status (3)

Country Link
US (5) US20100257477A1 (en)
CA (1) CA2761794C (en)
WO (2) WO2010114620A1 (en)

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8060304B2 (en) 2007-04-04 2011-11-15 Certusview Technologies, Llc Marking system and method
US8155390B2 (en) 2008-03-18 2012-04-10 Certusview Technologies, Llc Methods and apparatus for providing unbuffered dig area indicators on aerial images to delimit planned excavation sites
US8194932B2 (en) 2008-02-12 2012-06-05 Certusview Technologies, Llc Ticket approval system for and method of performing quality control in field service applications
US8265344B2 (en) 2008-02-12 2012-09-11 Certusview Technologies, Llc Electronic manifest of underground facility locate operation
US8270666B2 (en) 2008-02-12 2012-09-18 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8280117B2 (en) 2008-03-18 2012-10-02 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US8280969B2 (en) 2009-02-10 2012-10-02 Certusview Technologies, Llc Methods, apparatus and systems for requesting underground facility locate and marking operations and managing associated notifications
US8280631B2 (en) 2008-10-02 2012-10-02 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation based on marking device actuations
US8296308B2 (en) 2009-02-11 2012-10-23 Certusview Technologies, Llc Methods and apparatus for associating a virtual white line (VWL) image with corresponding ticket information for an excavation project
US8301380B2 (en) 2008-10-02 2012-10-30 Certusview Technologies, Llc Systems and methods for generating electronic records of locate and marking operations
US8311765B2 (en) 2009-08-11 2012-11-13 Certusview Technologies, Llc Locating equipment communicatively coupled to or equipped with a mobile/portable device
US8400155B2 (en) 2008-10-02 2013-03-19 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a locate operation based on an electronic record of locate information
US8401791B2 (en) 2007-03-13 2013-03-19 Certusview Technologies, Llc Methods for evaluating operation of marking apparatus
US8424486B2 (en) 2008-07-10 2013-04-23 Certusview Technologies, Llc Marker detection mechanisms for use in marking devices and methods of using same
US8442766B2 (en) 2008-10-02 2013-05-14 Certusview Technologies, Llc Marking apparatus having enhanced features for underground facility marking operations, and associated methods and systems
US8463487B2 (en) 2009-08-11 2013-06-11 Certusview Technologies, Llc Systems and methods for complex event processing based on a hierarchical arrangement of complex event processing engines
USD684067S1 (en) 2012-02-15 2013-06-11 Certusview Technologies, Llc Modular marking device
US8473209B2 (en) 2007-03-13 2013-06-25 Certusview Technologies, Llc Marking apparatus and marking methods using marking dispenser with machine-readable ID mechanism
US8478617B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods and apparatus for generating alerts on a locate device, based on comparing electronic locate information to facilities map information and/or other image information
US8510141B2 (en) 2008-10-02 2013-08-13 Certusview Technologies, Llc Methods and apparatus for generating alerts on a marking device, based on comparing electronic marking information to facilities map information and/or other image information
US8527308B2 (en) 2008-10-02 2013-09-03 Certusview Technologies, Llc Methods and apparatus for overlaying electronic locate information on facilities map information and/or other image information displayed on a locate device
US8566737B2 (en) 2009-02-11 2013-10-22 Certusview Technologies, Llc Virtual white lines (VWL) application for indicating an area of planned excavation
US8572193B2 (en) 2009-02-10 2013-10-29 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US8583372B2 (en) 2009-12-07 2013-11-12 Certusview Technologies, Llc Methods, apparatus, and systems for facilitating compliance with marking specifications for dispensing marking material
US8583264B2 (en) 2008-10-02 2013-11-12 Certusview Technologies, Llc Marking device docking stations and methods of using same
US8585410B2 (en) 2009-06-25 2013-11-19 Certusview Technologies, Llc Systems for and methods of simulating facilities for use in locate operations training exercises
US8589202B2 (en) 2008-10-02 2013-11-19 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a marking device
US8600848B2 (en) 2009-11-05 2013-12-03 Certusview Technologies, Llc Methods, apparatus and systems for ensuring wage and hour compliance in locate operations
US8612271B2 (en) 2008-10-02 2013-12-17 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks
US8612276B1 (en) 2009-02-11 2013-12-17 Certusview Technologies, Llc Methods, apparatus, and systems for dispatching service technicians
US8620587B2 (en) 2008-10-02 2013-12-31 Certusview Technologies, Llc Methods, apparatus, and systems for generating electronic records of locate and marking operations, and combined locate and marking apparatus for same
US8620726B2 (en) 2008-10-02 2013-12-31 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing locate information and marking information
US8620616B2 (en) 2009-08-20 2013-12-31 Certusview Technologies, Llc Methods and apparatus for assessing marking operations based on acceleration information
US8620572B2 (en) 2009-08-20 2013-12-31 Certusview Technologies, Llc Marking device with transmitter for triangulating location during locate operations
US8626571B2 (en) 2009-02-11 2014-01-07 Certusview Technologies, Llc Management system, and associated methods and apparatus, for dispatching tickets, receiving field information, and performing a quality assessment for underground facility locate and/or marking operations
US8700325B2 (en) 2007-03-13 2014-04-15 Certusview Technologies, Llc Marking apparatus and methods for creating an electronic record of marking operations
US8749239B2 (en) 2008-10-02 2014-06-10 Certusview Technologies, Llc Locate apparatus having enhanced features for underground facility locate operations, and associated methods and systems
US8805640B2 (en) 2010-01-29 2014-08-12 Certusview Technologies, Llc Locating equipment docking station communicatively coupled to or equipped with a mobile/portable device
US8830265B2 (en) 2009-07-07 2014-09-09 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility marking operations and assessing aspects of same
US20140347396A1 (en) * 2009-04-03 2014-11-27 Certusview Technologies, Llc Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US8902251B2 (en) 2009-02-10 2014-12-02 Certusview Technologies, Llc Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
US8918898B2 (en) 2010-07-30 2014-12-23 Certusview Technologies, Llc Methods, apparatus and systems for onsite linking to location-specific electronic records of locate operations
US8965700B2 (en) 2008-10-02 2015-02-24 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of environmental landmarks based on marking device actuations
US8977558B2 (en) 2010-08-11 2015-03-10 Certusview Technologies, Llc Methods, apparatus and systems for facilitating generation and assessment of engineering plans
US9046413B2 (en) 2010-08-13 2015-06-02 Certusview Technologies, Llc Methods, apparatus and systems for surface type detection in connection with locate and marking operations
US9097522B2 (en) 2009-08-20 2015-08-04 Certusview Technologies, Llc Methods and marking devices with mechanisms for indicating and/or detecting marking material color
US9124780B2 (en) 2010-09-17 2015-09-01 Certusview Technologies, Llc Methods and apparatus for tracking motion and/or orientation of a marking device
US9177403B2 (en) 2008-10-02 2015-11-03 Certusview Technologies, Llc Methods and apparatus for overlaying electronic marking information on facilities map information and/or other image information displayed on a marking device
US9208458B2 (en) 2008-10-02 2015-12-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to facilities maps
US9208464B2 (en) 2008-10-02 2015-12-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to historical information
US9280269B2 (en) 2008-02-12 2016-03-08 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US9473626B2 (en) 2008-06-27 2016-10-18 Certusview Technologies, Llc Apparatus and methods for evaluating a quality of a locate operation for underground utility
US9509772B1 (en) 2014-02-13 2016-11-29 Google Inc. Visualization and control of ongoing ingress actions
US9507791B2 (en) 2014-06-12 2016-11-29 Google Inc. Storage system user interface with floating file collection
US9531722B1 (en) 2013-10-31 2016-12-27 Google Inc. Methods for generating an activity stream
US9536199B1 (en) 2014-06-09 2017-01-03 Google Inc. Recommendations based on device usage
US9542457B1 (en) 2013-11-07 2017-01-10 Google Inc. Methods for displaying object history information
US9563863B2 (en) 2009-02-11 2017-02-07 Certusview Technologies, Llc Marking apparatus equipped with ticket processing software for facilitating marking operations, and associated methods
US9614880B1 (en) 2013-11-12 2017-04-04 Google Inc. Methods for real-time notifications in an activity stream
US9646275B2 (en) 2009-06-25 2017-05-09 Certusview Technologies, Llc Methods and apparatus for assessing risks associated with locate request tickets based on historical information
US9721302B2 (en) * 2012-05-24 2017-08-01 State Farm Mutual Automobile Insurance Company Server for real-time accident documentation and claim submission
US9870420B2 (en) 2015-01-19 2018-01-16 Google Llc Classification and storage of documents
US9916588B2 (en) 2008-06-27 2018-03-13 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on dynamic assessment parameters
US10078781B2 (en) * 2014-06-13 2018-09-18 Google Llc Automatically organizing images

Families Citing this family (141)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140354821A1 (en) * 1998-08-28 2014-12-04 David A. Monroe Covert Networked Security Camera
US10878646B2 (en) 2005-12-08 2020-12-29 Smartdrive Systems, Inc. Vehicle event recorder systems
US8996240B2 (en) 2006-03-16 2015-03-31 Smartdrive Systems, Inc. Vehicle event recorders with integrated web server
US9201842B2 (en) 2006-03-16 2015-12-01 Smartdrive Systems, Inc. Vehicle event recorder systems and networks having integrated cellular wireless communications systems
US8989959B2 (en) 2006-11-07 2015-03-24 Smartdrive Systems, Inc. Vehicle operator performance history recording, scoring and reporting systems
US8649933B2 (en) 2006-11-07 2014-02-11 Smartdrive Systems Inc. Power management systems for automotive video event recorders
US8868288B2 (en) 2006-11-09 2014-10-21 Smartdrive Systems, Inc. Vehicle exception event management systems
US8239092B2 (en) 2007-05-08 2012-08-07 Smartdrive Systems Inc. Distributed vehicle event recorder systems having a portable memory data transfer system
US9818157B2 (en) 2008-10-07 2017-11-14 State Farm Mutual Automobile Insurance Company Method for using electronic metadata to verify insurance claims
AU2010263264B2 (en) * 2009-06-25 2015-02-12 Certusview Technologies, Llc Locating equipment for and methods of simulating locate operations for training and/or skills evaluation
US8480332B2 (en) 2009-09-23 2013-07-09 Certusview Technologies, Llc Laying and protecting cable into existing covering surfaces
KR101650948B1 (en) * 2009-11-17 2016-08-24 엘지전자 주식회사 Method for displaying time information and display apparatus thereof
KR101714781B1 (en) 2009-11-17 2017-03-22 엘지전자 주식회사 Method for playing contents
KR101585692B1 (en) 2009-11-17 2016-01-14 엘지전자 주식회사 Method for displaying contents information
US9070305B1 (en) * 2010-01-22 2015-06-30 Google Inc. Traffic light detecting system and method
US9104202B2 (en) * 2010-05-11 2015-08-11 Irobot Corporation Remote vehicle missions and systems for supporting remote vehicle missions
EP2392898B1 (en) * 2010-06-04 2017-12-13 Sensirion AG Sensor system
US8989950B2 (en) 2011-02-15 2015-03-24 Bosch Automotive Service Solutions Llc Diagnostic tool with smart camera
TW201235921A (en) * 2011-02-22 2012-09-01 Hon Hai Prec Ind Co Ltd Weighable mobile terminal and method
CN102651780A (en) * 2011-02-23 2012-08-29 鸿富锦精密工业(深圳)有限公司 Mobile terminal and method for weighing object
US9140567B2 (en) 2011-03-03 2015-09-22 Telogis, Inc. Vehicle route calculation
US9317860B2 (en) 2011-03-08 2016-04-19 Bank Of America Corporation Collective network of augmented reality users
US9224166B2 (en) 2011-03-08 2015-12-29 Bank Of America Corporation Retrieving product information from embedded sensors via mobile device video analysis
US8873807B2 (en) * 2011-03-08 2014-10-28 Bank Of America Corporation Vehicle recognition
US9317835B2 (en) 2011-03-08 2016-04-19 Bank Of America Corporation Populating budgets and/or wish lists using real-time video image analysis
US9773285B2 (en) 2011-03-08 2017-09-26 Bank Of America Corporation Providing data associated with relationships between individuals and images
US8989700B2 (en) * 2011-03-17 2015-03-24 The Cordero Group Method and system for an interactive community alert network
US20130006674A1 (en) 2011-06-29 2013-01-03 State Farm Insurance Systems and Methods Using a Mobile Device to Collect Data for Insurance Premiums
US10977601B2 (en) 2011-06-29 2021-04-13 State Farm Mutual Automobile Insurance Company Systems and methods for controlling the collection of vehicle use data using a mobile device
CN102243663A (en) * 2011-08-01 2011-11-16 烟台杰瑞网络商贸有限公司 Dynamic labeling method based on electronic drawing
US8996234B1 (en) * 2011-10-11 2015-03-31 Lytx, Inc. Driver performance determination based on geolocation
US9298575B2 (en) 2011-10-12 2016-03-29 Lytx, Inc. Drive event capturing based on geolocation
US9158789B2 (en) 2011-12-30 2015-10-13 International Business Machines Corporation Coordinated geospatial, list-based and filter-based selection
US20130191189A1 (en) * 2012-01-19 2013-07-25 Siemens Corporation Non-enforcement autonomous parking management system and methods
US20130238747A1 (en) 2012-03-06 2013-09-12 Apple Inc. Image beaming for a media editing application
US9569078B2 (en) 2012-03-06 2017-02-14 Apple Inc. User interface tools for cropping and straightening image
US9131192B2 (en) 2012-03-06 2015-09-08 Apple Inc. Unified slider control for modifying multiple image properties
US20130254133A1 (en) * 2012-03-21 2013-09-26 RiskJockey, Inc. Proactive evidence dissemination
US9387813B1 (en) * 2012-03-21 2016-07-12 Road-Iq, Llc Device, system and method for aggregating networks and serving data from those networks to computers
US8731768B2 (en) * 2012-05-22 2014-05-20 Hartford Fire Insurance Company System and method to provide telematics data on a map display
US8977426B2 (en) 2012-06-04 2015-03-10 Geotab Inc. VIN based accelerometer threshold
US8756248B1 (en) * 2012-06-26 2014-06-17 C. Joseph Rickrode Rapid access information database (RAID) system and method for mobile entity data aggregation
US9194702B2 (en) * 2012-06-29 2015-11-24 Symbol Technologies, Llc Methods and apparatus for adjusting heading direction in a navigation system
WO2014020995A1 (en) * 2012-07-31 2014-02-06 古野電気株式会社 Meteorological information display system, human navigation device, meteorological information display program and meteorological information display method
US20150206330A1 (en) * 2012-07-31 2015-07-23 Furuno Electric Co., Ltd. Weather information display system, human navigation device, and method of displaying weather information
US9728228B2 (en) 2012-08-10 2017-08-08 Smartdrive Systems, Inc. Vehicle event playback apparatus and methods
US9460416B2 (en) * 2012-08-16 2016-10-04 Microsoft Technology Licensing, Llc Reading mode for interactive slide presentations with accompanying notes
US11086196B2 (en) * 2012-08-31 2021-08-10 Audatex North America, Llc Photo guide for vehicle
US9344683B1 (en) 2012-11-28 2016-05-17 Lytx, Inc. Capturing driving risk based on vehicle state and automatic detection of a state of a location
US9342806B2 (en) 2013-02-28 2016-05-17 P800X, Llc Method and system for automated project management
US10496942B2 (en) 2013-02-28 2019-12-03 P800X, Llc Method and system for automated project management of excavation requests
US20140277993A1 (en) * 2013-03-18 2014-09-18 Donald William HOOKWAY Motor Vehicle Lift Control System
IN2013MU00897A (en) * 2013-03-20 2015-05-29 Tata Consultancy Services Ltd
GB2512331A (en) * 2013-03-26 2014-10-01 Vas System Ltd Inputting and displaying data
US20140316825A1 (en) * 2013-04-18 2014-10-23 Audatex North America, Inc. Image based damage recognition and repair cost estimation
JP6183086B2 (en) * 2013-09-13 2017-08-23 カシオ計算機株式会社 Communication apparatus and communication program
US9558408B2 (en) * 2013-10-15 2017-01-31 Ford Global Technologies, Llc Traffic signal prediction
US9501878B2 (en) 2013-10-16 2016-11-22 Smartdrive Systems, Inc. Vehicle event playback apparatus and methods
US9412031B2 (en) * 2013-10-16 2016-08-09 Xerox Corporation Delayed vehicle identification for privacy enforcement
US9892567B2 (en) 2013-10-18 2018-02-13 State Farm Mutual Automobile Insurance Company Vehicle sensor collection of other vehicle information
US8954226B1 (en) * 2013-10-18 2015-02-10 State Farm Mutual Automobile Insurance Company Systems and methods for visualizing an accident involving a vehicle
US9262787B2 (en) 2013-10-18 2016-02-16 State Farm Mutual Automobile Insurance Company Assessing risk using vehicle environment information
US9361650B2 (en) 2013-10-18 2016-06-07 State Farm Mutual Automobile Insurance Company Synchronization of vehicle sensor information
US10121291B2 (en) * 2013-10-29 2018-11-06 Ford Global Technologies, Llc Method and apparatus for visual accident detail reporting
US9610955B2 (en) 2013-11-11 2017-04-04 Smartdrive Systems, Inc. Vehicle fuel consumption monitor and feedback systems
KR20150059490A (en) * 2013-11-22 2015-06-01 현대자동차주식회사 System for transmitting an accident data and method thereof
TWI516950B (en) * 2013-12-23 2016-01-11 勝捷光電股份有限公司 System and method for sharing real-time recording
US8892310B1 (en) 2014-02-21 2014-11-18 Smartdrive Systems, Inc. System and method to detect execution of driving maneuvers
JP5689993B1 (en) * 2014-02-28 2015-03-25 株式会社ブリヂストン Vehicle running state monitoring device
WO2015134311A1 (en) * 2014-03-03 2015-09-11 Inrix Inc Traffic obstruction detection
US9428056B2 (en) * 2014-03-11 2016-08-30 Textron Innovations, Inc. Adjustable synthetic vision
US10347140B2 (en) 2014-03-11 2019-07-09 Textron Innovations Inc. Flight planning and communication
US10042456B2 (en) 2014-03-11 2018-08-07 Textron Innovations Inc. User interface for an aircraft
US9772712B2 (en) 2014-03-11 2017-09-26 Textron Innovations, Inc. Touch screen instrument panel
KR20150108701A (en) * 2014-03-18 2015-09-30 삼성전자주식회사 System and method for visualizing anatomic elements in a medical image
US9189839B1 (en) 2014-04-24 2015-11-17 Google Inc. Automatically generating panorama tours
US10319039B1 (en) 2014-05-20 2019-06-11 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US11669090B2 (en) 2014-05-20 2023-06-06 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US10354330B1 (en) 2014-05-20 2019-07-16 State Farm Mutual Automobile Insurance Company Autonomous feature use monitoring and insurance pricing
US10185999B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Autonomous feature use monitoring and telematics
US10599155B1 (en) 2014-05-20 2020-03-24 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US10373259B1 (en) 2014-05-20 2019-08-06 State Farm Mutual Automobile Insurance Company Fully autonomous vehicle insurance pricing
US9972054B1 (en) 2014-05-20 2018-05-15 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US9826164B2 (en) * 2014-05-30 2017-11-21 Furuno Electric Co., Ltd. Marine environment display device
US9002647B1 (en) 2014-06-27 2015-04-07 Google Inc. Generating turn-by-turn direction previews
USD757047S1 (en) * 2014-07-11 2016-05-24 Google Inc. Display screen with animated graphical user interface
US10475127B1 (en) 2014-07-21 2019-11-12 State Farm Mutual Automobile Insurance Company Methods of providing insurance savings based upon telematics and insurance incentives
US20160073061A1 (en) * 2014-09-04 2016-03-10 Adesa, Inc. Vehicle Documentation System
US10336321B1 (en) 2014-11-13 2019-07-02 State Farm Mutual Automobile Insurance Company Autonomous vehicle control assessment and selection
US11069257B2 (en) * 2014-11-13 2021-07-20 Smartdrive Systems, Inc. System and method for detecting a vehicle event and generating review criteria
US9679420B2 (en) 2015-04-01 2017-06-13 Smartdrive Systems, Inc. Vehicle event recording system and method
CN112634643B (en) * 2015-05-29 2022-06-14 荣耀终端有限公司 Traffic information updating method and device
US9767564B2 (en) 2015-08-14 2017-09-19 International Business Machines Corporation Monitoring of object impressions and viewing patterns
US9805601B1 (en) 2015-08-28 2017-10-31 State Farm Mutual Automobile Insurance Company Vehicular traffic alerts for avoidance of abnormal traffic conditions
US20170069033A1 (en) * 2015-09-04 2017-03-09 Red Fox, Inc. OBA iosweep, Inc. Incident Reporting Assistance System
WO2017053612A1 (en) * 2015-09-25 2017-03-30 Nyqamin Dynamics Llc Automated capture of image data for points of interest
US10818382B1 (en) 2015-11-20 2020-10-27 Massachusetts Mutual Life Insurance Company Systems, methods, and apparatus for acquiring data
US11441916B1 (en) 2016-01-22 2022-09-13 State Farm Mutual Automobile Insurance Company Autonomous vehicle trip routing
US11719545B2 (en) 2016-01-22 2023-08-08 Hyundai Motor Company Autonomous vehicle component damage and salvage assessment
US10324463B1 (en) 2016-01-22 2019-06-18 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation adjustment based upon route
US10134278B1 (en) 2016-01-22 2018-11-20 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US10545024B1 (en) 2016-01-22 2020-01-28 State Farm Mutual Automobile Insurance Company Autonomous vehicle trip routing
US9940834B1 (en) 2016-01-22 2018-04-10 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US10395332B1 (en) 2016-01-22 2019-08-27 State Farm Mutual Automobile Insurance Company Coordinated autonomous vehicle automatic area scanning
US11242051B1 (en) 2016-01-22 2022-02-08 State Farm Mutual Automobile Insurance Company Autonomous vehicle action communications
US9846915B2 (en) * 2016-03-17 2017-12-19 Conduent Business Services, Llc Image capture system for property damage assessment
US10511676B2 (en) 2016-03-17 2019-12-17 Conduent Business Services, Llc Image analysis system for property damage assessment and verification
US10152836B2 (en) * 2016-04-19 2018-12-11 Mitchell International, Inc. Systems and methods for use of diagnostic scan tool in automotive collision repair
US9870609B2 (en) 2016-06-03 2018-01-16 Conduent Business Services, Llc System and method for assessing usability of captured images
US11120505B2 (en) 2016-06-03 2021-09-14 Conduent Business Services, Llc Image analysis system for verification of property roof damage
US11055786B2 (en) 2016-06-03 2021-07-06 Conduent Business Services, Llc Image segmentation system for verification of property roof damage
CN106297290A (en) * 2016-08-27 2017-01-04 时空链(北京)科技有限公司 A kind of driving behavior processing method, mobile unit and cloud server
WO2018066329A1 (en) * 2016-10-03 2018-04-12 日立オートモティブシステムズ株式会社 In-vehicle electronic control apparatus
US10580234B2 (en) 2017-01-20 2020-03-03 Adesa, Inc. Vehicle documentation system
DE102017202763A1 (en) 2017-02-21 2018-08-23 Telemotive Aktiengesellschaft Method for evaluating vehicle operating information
US10885652B2 (en) * 2017-03-22 2021-01-05 Magna Electronics Inc. Trailer angle detection system for vehicle
CA2968594A1 (en) * 2017-05-11 2018-11-11 The Manitoba Public Insurance Corporation Method for automated insured loss claim profile determination
US10852341B2 (en) * 2017-06-16 2020-12-01 Florida Power & Light Company Composite fault mapping
AU2018309077A1 (en) * 2017-08-04 2020-02-20 Cambridge Mobile Telematics Inc Method and system for accident detection using contextual data
GB201719108D0 (en) * 2017-11-17 2018-01-03 Xtract360 Ltd Collision evaluation
US10719963B2 (en) * 2017-11-27 2020-07-21 Uber Technologies, Inc. Graphical user interface map feature for a network service
US11377123B2 (en) 2017-12-22 2022-07-05 Nissan North America, Inc. Solution path overlay interfaces for autonomous vehicles
DE102018204501B3 (en) * 2018-03-23 2019-07-04 Continental Automotive Gmbh System for generating confidence values in the backend
ES2736901A1 (en) 2018-06-29 2020-01-08 Geotab Inc Characterization of a vehicle collision (Machine-translation by Google Translate, not legally binding)
JP7286287B2 (en) * 2018-09-14 2023-06-05 株式会社小松製作所 Work machine display system and its control method
US11741763B2 (en) 2018-12-26 2023-08-29 Allstate Insurance Company Systems and methods for system generated damage analysis
EP3895146A1 (en) * 2019-01-24 2021-10-20 Mobileye Vision Technologies Ltd. Clustering event information for vehicle navigation
US11494847B2 (en) 2019-08-29 2022-11-08 Toyota Motor North America, Inc. Analysis of transport damage
US11024169B2 (en) * 2019-09-09 2021-06-01 International Business Machines Corporation Methods and systems for utilizing vehicles to investigate events
US11525243B2 (en) * 2019-09-16 2022-12-13 Caterpillar Inc. Image-based productivity tracking system
US20210110433A1 (en) * 2019-10-10 2021-04-15 Ford Global Technologies, Llc Vehicle caching of local business data
US11747147B2 (en) * 2019-12-30 2023-09-05 Gm Cruise Holdings Llc Dynamic map rendering
US11254316B2 (en) * 2020-01-24 2022-02-22 Ford Global Technologies, Llc Driver distraction detection
US11710186B2 (en) 2020-04-24 2023-07-25 Allstate Insurance Company Determining geocoded region based rating systems for decisioning outputs
CN111862593B (en) * 2020-06-03 2022-04-01 阿波罗智联(北京)科技有限公司 Method and device for reporting traffic events, electronic equipment and storage medium
KR20210158705A (en) * 2020-06-24 2021-12-31 현대자동차주식회사 Vehicle and control method thereof
US11709870B2 (en) 2020-07-08 2023-07-25 Worster Construction Management Llc Comprehensive utility line database and user interface for excavation sites
US11862022B2 (en) 2021-02-03 2024-01-02 Geotab Inc. Methods for characterizing a vehicle collision
US11884285B2 (en) 2021-02-03 2024-01-30 Geotab Inc. Systems for characterizing a vehicle collision
CN112565465A (en) * 2021-02-19 2021-03-26 智道网联科技(北京)有限公司 Data acquisition method, device and system based on Internet of vehicles
CN114419199B (en) * 2021-12-20 2023-11-07 北京百度网讯科技有限公司 Picture marking method and device, electronic equipment and storage medium

Citations (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6018980A (en) * 1997-04-21 2000-02-01 Nec Home Electronics, Ltd. Method and device for determining deformation of a vehicle side part
US6185490B1 (en) * 1999-03-15 2001-02-06 Thomas W. Ferguson Vehicle crash data recorder
US6246933B1 (en) * 1999-11-04 2001-06-12 BAGUé ADOLFO VAEZA Traffic accident data recorder and traffic accident reproduction system and method
US20030046003A1 (en) * 2001-09-06 2003-03-06 Wdt Technologies, Inc. Accident evidence recording method
US20030125853A1 (en) * 2001-12-29 2003-07-03 Masahito Takagi Traffic accident recording system
US6704644B1 (en) * 1999-07-29 2004-03-09 Aioi Insurance Co., Ltd. Consultation business support system
US6882912B2 (en) * 2002-03-19 2005-04-19 Ford Global Technologies, Llc Real time stamping synchronization system
US20050129324A1 (en) * 2003-12-02 2005-06-16 Lemke Alan P. Digital camera and method providing selective removal and addition of an imaged object
US20060031103A1 (en) * 2004-08-06 2006-02-09 Henry David S Systems and methods for diagram data collection
US20060242418A1 (en) * 2005-04-25 2006-10-26 Xerox Corporation Method for ensuring the integrity of image sets
US20070044539A1 (en) * 2005-03-01 2007-03-01 Bryan Sabol System and method for visual representation of a catastrophic event and coordination of response
US7260273B2 (en) * 2003-08-08 2007-08-21 Seiko Epson Corporation System and method of editing a digital image to remove unwanted artifacts, objects and the like
US20080042410A1 (en) * 1995-10-30 2008-02-21 Automotive Technologies International, Inc. Vehicular Electrical System with Crash Sensors and Occupant Protection Systems
US20080052134A1 (en) * 2006-05-18 2008-02-28 Vikki Nowak Rich claim reporting system
US7418131B2 (en) * 2004-08-27 2008-08-26 National Cheng Kung University Image-capturing device and method for removing strangers from an image
US20090013928A1 (en) * 2007-04-04 2009-01-15 Certusview Technologies, Llc Marking system and method
US20090051515A1 (en) * 2005-04-15 2009-02-26 Nikon Corporation Imaging Apparatus and Drive Recorder System
US20090073191A1 (en) * 2005-04-21 2009-03-19 Microsoft Corporation Virtual earth rooftop overlay and bounding
US20090202101A1 (en) * 2008-02-12 2009-08-13 Dycom Technology, Llc Electronic manifest of underground facility locate marks
US20090201178A1 (en) * 2007-03-13 2009-08-13 Nielsen Steven E Methods for evaluating operation of marking apparatus
US20100010882A1 (en) * 2008-06-27 2010-01-14 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on dynamic assessment parameters
US20100006667A1 (en) * 2008-07-10 2010-01-14 Nielsen Steven E Marker detection mechanisms for use in marking devices and methods of using same
US7660725B2 (en) * 2002-11-27 2010-02-09 Computer Sciences Corporation Computerized method and system for estimating an effect on liability based on the stopping distance of vehicles
US20100085701A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Marking device docking stations having security features and methods of using same
US20100086671A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation including service-related information and/or ticket information
US20100088134A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to historical information
US20100085054A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Systems and methods for generating electronic records of locate and marking operations
US20100085185A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US20100088032A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods, apparatus, and systems for generating electronic records of locate and marking operations, and combined locate and marking apparatus for same
US20100088164A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to facilities maps
US20100088135A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks
US20100088031A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of environmental landmarks based on marking device actuations
US20100189312A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for overlaying electronic locate information on facilities map information and/or other image information displayed on a locate device
US20100188407A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a marking device
US20100189887A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Marking apparatus having enhanced features for underground facility marking operations, and associated methods and systems
US20100188088A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a locate device
US20100188245A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Locate apparatus having enhanced features for underground facility locate operations, and associated methods and systems
US20100188215A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for generating alerts on a marking device, based on comparing electronic marking information to facilities map information and/or other image information
US20100188216A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for generating alerts on a locate device, based on comparing electronic locate information to facilities map information and/or other image information
US20110007076A1 (en) * 2009-07-07 2011-01-13 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US20110022433A1 (en) * 2009-06-25 2011-01-27 Certusview Technologies, Llc Methods and apparatus for assessing locate request tickets
US20110020776A1 (en) * 2009-06-25 2011-01-27 Certusview Technologies, Llc Locating equipment for and methods of simulating locate operations for training and/or skills evaluation
US20110035245A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for processing technician workflows for locate and/or marking operations
US7890353B2 (en) * 2000-10-02 2011-02-15 Computer Sciences Corporation Computerized method and system of liability assessment for an accident using environmental, vehicle, and driver conditions and driver actions
US20110045175A1 (en) * 2009-08-20 2011-02-24 Certusview Technologies, Llc Methods and marking devices with mechanisms for indicating and/or detecting marking material color
US20110046999A1 (en) * 2008-10-02 2011-02-24 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing locate information and marking information
US20110060549A1 (en) * 2009-08-20 2011-03-10 Certusview Technologies, Llc Methods and apparatus for assessing marking operations based on acceleration information
US20110060496A1 (en) * 2009-08-11 2011-03-10 Certusview Technologies, Llc Systems and methods for complex event processing of vehicle information and image information relating to a vehicle
US20110131081A1 (en) * 2009-02-10 2011-06-02 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US20110137769A1 (en) * 2009-11-05 2011-06-09 Certusview Technologies, Llc Methods, apparatus and systems for ensuring wage and hour compliance in locate operations
US20120036140A1 (en) * 2010-08-05 2012-02-09 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing filtered locate and/or marking information
US20120065944A1 (en) * 2010-08-11 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for facilitating generation and assessment of engineering plans
US20120065924A1 (en) * 2010-08-13 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for surface type detection in connection with locate and marking operations
US20120066273A1 (en) * 2010-07-30 2012-03-15 Certusview Technologies, Llc System for and methods of automatically inserting symbols into electronic records of locate operations
US20120066506A1 (en) * 2010-07-30 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for onsite linking to location-specific electronic records of locate operations
US20120066137A1 (en) * 2010-07-30 2012-03-15 CertusView Technolgies, LLC System for and methods of confirming locate operation work orders with respect to municipal permits
US20120072035A1 (en) * 2010-09-17 2012-03-22 Steven Nielsen Methods and apparatus for dispensing material and electronically tracking same
US20120069178A1 (en) * 2010-09-17 2012-03-22 Certusview Technologies, Llc Methods and apparatus for tracking motion and/or orientation of a marking device
US8155390B2 (en) * 2008-03-18 2012-04-10 Certusview Technologies, Llc Methods and apparatus for providing unbuffered dig area indicators on aerial images to delimit planned excavation sites
US20120110019A1 (en) * 2009-02-10 2012-05-03 Certusview Technologies, Llc Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
US20120113244A1 (en) * 2010-08-13 2012-05-10 Certusview Technologies, Llc Methods, apparatus and systems for marking material color detection in connection with locate and marking operations
US20130002854A1 (en) * 2010-09-17 2013-01-03 Certusview Technologies, Llc Marking methods, apparatus and systems including optical flow-based dead reckoning features
US20130006718A1 (en) * 2011-07-01 2013-01-03 Certusview Technologies, Llc Methods, apparatus and systems for chronicling the activities of field technicians
US20130004918A1 (en) * 2011-03-23 2013-01-03 Salah Huwais Fluted osteotome and surgical method for use
US8480332B2 (en) * 2009-09-23 2013-07-09 Certusview Technologies, Llc Laying and protecting cable into existing covering surfaces
US20130186333A1 (en) * 2008-10-02 2013-07-25 Steven Nielsen Methods and apparatus for overlaying electronic marking information on facilities map information and/or other image information displayed on a marking device
US8630463B2 (en) * 2008-02-12 2014-01-14 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2521024B2 (en) * 1993-04-20 1996-07-31 淡路フェリーボート株式会社 Traffic accident data recorder and traffic accident reproduction system
US6141611A (en) * 1998-12-01 2000-10-31 John J. Mackey Mobile vehicle accident data system
JP2001076012A (en) * 1999-08-31 2001-03-23 Hitachi Ltd Method and device for gathering vehicle information
US6493650B1 (en) * 2000-01-27 2002-12-10 Optimus Corporation Device for automatic documentation of crash scenes
JP2002042288A (en) * 2000-07-26 2002-02-08 Yazaki Corp Running state recording device and running control system using it
US7386376B2 (en) * 2002-01-25 2008-06-10 Intelligent Mechatronic Systems, Inc. Vehicle visual and non-visual data recording system
US6580981B1 (en) * 2002-04-16 2003-06-17 Meshnetworks, Inc. System and method for providing wireless telematics store and forward messaging for peer-to-peer and peer-to-peer-to-infrastructure a communication network
US7818187B2 (en) * 2002-11-27 2010-10-19 Computer Sciences Corporation Computerized method and system for estimating liability
WO2006120911A1 (en) * 2005-05-09 2006-11-16 Nikon Corporation Imaging device and drive recorder system
US9086277B2 (en) 2007-03-13 2015-07-21 Certusview Technologies, Llc Electronically controlled marking apparatus and methods
US8473209B2 (en) 2007-03-13 2013-06-25 Certusview Technologies, Llc Marking apparatus and marking methods using marking dispenser with machine-readable ID mechanism
DE102007042577B3 (en) 2007-09-07 2009-04-02 Continental Automotive Gmbh Method for controlling a combustion process and control unit
US8280117B2 (en) * 2008-03-18 2012-10-02 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US9659268B2 (en) 2008-02-12 2017-05-23 Certusview Technologies, Llc Ticket approval system for and method of performing quality control in field service applications
US20100036683A1 (en) * 2008-08-05 2010-02-11 Logan Andrew J Diagramming tool for vehicle insurance claims
US20100257477A1 (en) * 2009-04-03 2010-10-07 Certusview Technologies, Llc Methods, apparatus, and systems for documenting and reporting events via geo-referenced electronic drawings
CA2710269C (en) * 2009-08-11 2012-05-22 Certusview Technologies, Llc Locating equipment communicatively coupled to or equipped with a mobile/portable device
CA2713282C (en) * 2009-08-20 2013-03-19 Certusview Technologies, Llc Marking device with transmitter for triangulating location during marking operations
WO2011094703A1 (en) * 2010-01-29 2011-08-04 Certusview Technologies, Llc Locating equipment docking station communicatively coupled to or equipped with a mobile/portable device
US20120328162A1 (en) 2011-06-22 2012-12-27 Certusview Technologies, Llc Methods, apparatus, and systems for performing installations of engineered systems and generating site visit manifests for same
US9358735B2 (en) * 2011-11-29 2016-06-07 Novartis Ag Method of treating a lens forming surface of at least one mold half of a mold for molding ophthalmic lenses

Patent Citations (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080042410A1 (en) * 1995-10-30 2008-02-21 Automotive Technologies International, Inc. Vehicular Electrical System with Crash Sensors and Occupant Protection Systems
US6018980A (en) * 1997-04-21 2000-02-01 Nec Home Electronics, Ltd. Method and device for determining deformation of a vehicle side part
US6185490B1 (en) * 1999-03-15 2001-02-06 Thomas W. Ferguson Vehicle crash data recorder
US6704644B1 (en) * 1999-07-29 2004-03-09 Aioi Insurance Co., Ltd. Consultation business support system
US6246933B1 (en) * 1999-11-04 2001-06-12 BAGUÉ ADOLFO VAEZA Traffic accident data recorder and traffic accident reproduction system and method
US7890353B2 (en) * 2000-10-02 2011-02-15 Computer Sciences Corporation Computerized method and system of liability assessment for an accident using environmental, vehicle, and driver conditions and driver actions
US20030046003A1 (en) * 2001-09-06 2003-03-06 Wdt Technologies, Inc. Accident evidence recording method
US20030125853A1 (en) * 2001-12-29 2003-07-03 Masahito Takagi Traffic accident recording system
US6882912B2 (en) * 2002-03-19 2005-04-19 Ford Global Technologies, Llc Real time stamping synchronization system
US7660725B2 (en) * 2002-11-27 2010-02-09 Computer Sciences Corporation Computerized method and system for estimating an effect on liability based on the stopping distance of vehicles
US7260273B2 (en) * 2003-08-08 2007-08-21 Seiko Epson Corporation System and method of editing a digital image to remove unwanted artifacts, objects and the like
US20050129324A1 (en) * 2003-12-02 2005-06-16 Lemke Alan P. Digital camera and method providing selective removal and addition of an imaged object
US20060031103A1 (en) * 2004-08-06 2006-02-09 Henry David S Systems and methods for diagram data collection
US7418131B2 (en) * 2004-08-27 2008-08-26 National Cheng Kung University Image-capturing device and method for removing strangers from an image
US20070044539A1 (en) * 2005-03-01 2007-03-01 Bryan Sabol System and method for visual representation of a catastrophic event and coordination of response
US20090051515A1 (en) * 2005-04-15 2009-02-26 Nikon Corporation Imaging Apparatus and Drive Recorder System
US20090073191A1 (en) * 2005-04-21 2009-03-19 Microsoft Corporation Virtual earth rooftop overlay and bounding
US20060242418A1 (en) * 2005-04-25 2006-10-26 Xerox Corporation Method for ensuring the integrity of image sets
US20080052134A1 (en) * 2006-05-18 2008-02-28 Vikki Nowak Rich claim reporting system
US8775077B2 (en) * 2007-03-13 2014-07-08 Certusview Technologies, Llc Systems and methods for using location data to electronically display dispensing of markers by a marking system or marking tool
US20090201178A1 (en) * 2007-03-13 2009-08-13 Nielsen Steven E Methods for evaluating operation of marking apparatus
US20100094553A1 (en) * 2007-03-13 2010-04-15 Certusview Technologies, Llc Systems and methods for using location data and/or time data to electronically display dispensing of markers by a marking system or marking tool
US20100090858A1 (en) * 2007-04-04 2010-04-15 Certusview Technologies, Llc Systems and methods for using marking information to electronically display dispensing of markers by a marking system or marking tool
US20090013928A1 (en) * 2007-04-04 2009-01-15 Certusview Technologies, Llc Marking system and method
US20130174072A9 (en) * 2008-02-12 2013-07-04 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US20090202101A1 (en) * 2008-02-12 2009-08-13 Dycom Technology, Llc Electronic manifest of underground facility locate marks
US20130135343A1 (en) * 2008-02-12 2013-05-30 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US20090201311A1 (en) * 2008-02-12 2009-08-13 Steven Nielsen Electronic manifest of underground facility locate marks
US20140022272A1 (en) * 2008-02-12 2014-01-23 Certusview Technologies, Llc Electronically documenting locate operations for underground utilities
US8630463B2 (en) * 2008-02-12 2014-01-14 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8155390B2 (en) * 2008-03-18 2012-04-10 Certusview Technologies, Llc Methods and apparatus for providing unbuffered dig area indicators on aerial images to delimit planned excavation sites
US20100010863A1 (en) * 2008-06-27 2010-01-14 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on multiple scoring categories
US20100010883A1 (en) * 2008-06-27 2010-01-14 Certusview Technologies, Llc Methods and apparatus for facilitating a quality assessment of a field service operation based on multiple quality assessment criteria
US20100010862A1 (en) * 2008-06-27 2010-01-14 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on geographic information
US20100010882A1 (en) * 2008-06-27 2010-01-14 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on dynamic assessment parameters
US20100006667A1 (en) * 2008-07-10 2010-01-14 Nielsen Steven E Marker detection mechanisms for use in marking devices and methods of using same
US20100188216A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for generating alerts on a locate device, based on comparing electronic locate information to facilities map information and/or other image information
US20110095885A9 (en) * 2008-10-02 2011-04-28 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US20100088135A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks
US20100088031A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of environmental landmarks based on marking device actuations
US20100088032A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods, apparatus, and systems for generating electronic records of locate and marking operations, and combined locate and marking apparatus for same
US20100086677A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation based on marking device actuations
US20100090700A1 (en) * 2008-10-02 2010-04-15 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a locate operation based on an electronic record of locate information
US20100117654A1 (en) * 2008-10-02 2010-05-13 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a locate and/or marking operation using display layers
US20100189312A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for overlaying electronic locate information on facilities map information and/or other image information displayed on a locate device
US20100188407A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a marking device
US20100189887A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Marking apparatus having enhanced features for underground facility marking operations, and associated methods and systems
US20100188088A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a locate device
US20100188245A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Locate apparatus having enhanced features for underground facility locate operations, and associated methods and systems
US20100188215A1 (en) * 2008-10-02 2010-07-29 Certusview Technologies, Llc Methods and apparatus for generating alerts on a marking device, based on comparing electronic marking information to facilities map information and/or other image information
US20100085185A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US20130103318A1 (en) * 2008-10-02 2013-04-25 Certusview Technologies, Llc Systems and methods for generating electronic records of locate and marking operations
US20100085054A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Systems and methods for generating electronic records of locate and marking operations
US20100085694A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Marking device docking stations and methods of using same
US20130186333A1 (en) * 2008-10-02 2013-07-25 Steven Nielsen Methods and apparatus for overlaying electronic marking information on facilities map information and/or other image information displayed on a marking device
US20100088134A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to historical information
US20100086671A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation including service-related information and/or ticket information
US20140035587A1 (en) * 2008-10-02 2014-02-06 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US20100084532A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Marking device docking stations having mechanical docking and methods of using same
US20140074970A1 (en) * 2008-10-02 2014-03-13 Certusview Technologies, Llc Methods And Apparatus For Generating Output Data Streams Relating To Underground Utility Marking Operations
US20100085376A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a marking operation based on an electronic record of marking information
US20100088164A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to facilities maps
US20110046999A1 (en) * 2008-10-02 2011-02-24 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing locate information and marking information
US20100085701A1 (en) * 2008-10-02 2010-04-08 Certusview Technologies, Llc Marking device docking stations having security features and methods of using same
US20140122149A1 (en) * 2009-02-10 2014-05-01 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US20120110019A1 (en) * 2009-02-10 2012-05-03 Certusview Technologies, Llc Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
US20110131081A1 (en) * 2009-02-10 2011-06-02 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US20110035245A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for processing technician workflows for locate and/or marking operations
US20110035324A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for generating technician workflows for locate and/or marking operations
US20110035260A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for quality assessment of locate and/or marking operations based on process guides
US20110035252A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for processing technician checklists for locate and/or marking operations
US20110035251A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for facilitating and/or verifying locate and/or marking operations
US20110035328A1 (en) * 2009-02-11 2011-02-10 Certusview Technologies, Llc Methods, apparatus, and systems for generating technician checklists for locate and/or marking operations
US20110040590A1 (en) * 2009-06-25 2011-02-17 Certusview Technologies, Llc Methods and apparatus for improving a ticket assessment system
US20110040589A1 (en) * 2009-06-25 2011-02-17 Certusview Technologies, Llc Methods and apparatus for assessing complexity of locate request tickets
US20110046994A1 (en) * 2009-06-25 2011-02-24 Certusview Technologies, Llc Methods and apparatus for multi-stage assessment of locate request tickets
US20110020776A1 (en) * 2009-06-25 2011-01-27 Certusview Technologies, Llc Locating equipment for and methods of simulating locate operations for training and/or skills evaluation
US20110022433A1 (en) * 2009-06-25 2011-01-27 Certusview Technologies, Llc Methods and apparatus for assessing locate request tickets
US20110046993A1 (en) * 2009-06-25 2011-02-24 Certusview Technologies, Llc Methods and apparatus for assessing risks associated with locate request tickets
US20120019380A1 (en) * 2009-07-07 2012-01-26 Certusview Technologies, Llc Methods, apparatus and systems for generating accuracy-annotated searchable electronic records of underground facility locate and/or marking operations
US20130147637A1 (en) * 2009-07-07 2013-06-13 Steven Nielsen Methods, apparatus and systems for generating searchable electronic records of underground facility marking operations and assessing aspects of same
US20110007076A1 (en) * 2009-07-07 2011-01-13 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US20110060496A1 (en) * 2009-08-11 2011-03-10 Certusview Technologies, Llc Systems and methods for complex event processing of vehicle information and image information relating to a vehicle
US20110093306A1 (en) * 2009-08-11 2011-04-21 Certusview Technologies, Llc Fleet management systems and methods for complex event processing of vehicle-related information via local and remote complex event processing engines
US8463487B2 (en) * 2009-08-11 2013-06-11 Certusview Technologies, Llc Systems and methods for complex event processing based on a hierarchical arrangement of complex event processing engines
US8467932B2 (en) * 2009-08-11 2013-06-18 Certusview Technologies, Llc Systems and methods for complex event processing of vehicle-related information
US8473148B2 (en) * 2009-08-11 2013-06-25 Certusview Technologies, Llc Fleet management systems and methods for complex event processing of vehicle-related information via local and remote complex event processing engines
US20110045175A1 (en) * 2009-08-20 2011-02-24 Certusview Technologies, Llc Methods and marking devices with mechanisms for indicating and/or detecting marking material color
US20110060549A1 (en) * 2009-08-20 2011-03-10 Certusview Technologies, Llc Methods and apparatus for assessing marking operations based on acceleration information
US8480332B2 (en) * 2009-09-23 2013-07-09 Certusview Technologies, Llc Laying and protecting cable into existing covering surfaces
US20110137769A1 (en) * 2009-11-05 2011-06-09 Certusview Technologies, Llc Methods, apparatus and systems for ensuring wage and hour compliance in locate operations
US20120066273A1 (en) * 2010-07-30 2012-03-15 Certusview Technologies, Llc System for and methods of automatically inserting symbols into electronic records of locate operations
US20120066137A1 (en) * 2010-07-30 2012-03-15 Certusview Technologies, Llc System for and methods of confirming locate operation work orders with respect to municipal permits
US20120066506A1 (en) * 2010-07-30 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for onsite linking to location-specific electronic records of locate operations
US20120036140A1 (en) * 2010-08-05 2012-02-09 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing filtered locate and/or marking information
US20120065944A1 (en) * 2010-08-11 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for facilitating generation and assessment of engineering plans
US20120065924A1 (en) * 2010-08-13 2012-03-15 Certusview Technologies, Llc Methods, apparatus and systems for surface type detection in connection with locate and marking operations
US20120113244A1 (en) * 2010-08-13 2012-05-10 Certusview Technologies, Llc Methods, apparatus and systems for marking material color detection in connection with locate and marking operations
US20120072035A1 (en) * 2010-09-17 2012-03-22 Steven Nielsen Methods and apparatus for dispensing material and electronically tracking same
US20120069178A1 (en) * 2010-09-17 2012-03-22 Certusview Technologies, Llc Methods and apparatus for tracking motion and/or orientation of a marking device
US20130002854A1 (en) * 2010-09-17 2013-01-03 Certusview Technologies, Llc Marking methods, apparatus and systems including optical flow-based dead reckoning features
US20130004918A1 (en) * 2011-03-23 2013-01-03 Salah Huwais Fluted osteotome and surgical method for use
US20130006718A1 (en) * 2011-07-01 2013-01-03 Certusview Technologies, Llc Methods, apparatus and systems for chronicling the activities of field technicians

Cited By (143)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9086277B2 (en) 2007-03-13 2015-07-21 Certusview Technologies, Llc Electronically controlled marking apparatus and methods
US8473209B2 (en) 2007-03-13 2013-06-25 Certusview Technologies, Llc Marking apparatus and marking methods using marking dispenser with machine-readable ID mechanism
US8700325B2 (en) 2007-03-13 2014-04-15 Certusview Technologies, Llc Marking apparatus and methods for creating an electronic record of marking operations
US8775077B2 (en) 2007-03-13 2014-07-08 Certusview Technologies, Llc Systems and methods for using location data to electronically display dispensing of markers by a marking system or marking tool
US8407001B2 (en) 2007-03-13 2013-03-26 Certusview Technologies, Llc Systems and methods for using location data to electronically display dispensing of markers by a marking system or marking tool
US8401791B2 (en) 2007-03-13 2013-03-19 Certusview Technologies, Llc Methods for evaluating operation of marking apparatus
US8903643B2 (en) 2007-03-13 2014-12-02 Certusview Technologies, Llc Hand-held marking apparatus with location tracking system and methods for logging geographic location of same
US8386178B2 (en) 2007-04-04 2013-02-26 Certusview Technologies, Llc Marking system and method
US8060304B2 (en) 2007-04-04 2011-11-15 Certusview Technologies, Llc Marking system and method
US8374789B2 (en) 2007-04-04 2013-02-12 Certusview Technologies, Llc Systems and methods for using marking information to electronically display dispensing of markers by a marking system or marking tool
US9256964B2 (en) 2008-02-12 2016-02-09 Certusview Technologies, Llc Electronically documenting locate operations for underground utilities
US8194932B2 (en) 2008-02-12 2012-06-05 Certusview Technologies, Llc Ticket approval system for and method of performing quality control in field service applications
US8290204B2 (en) 2008-02-12 2012-10-16 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8994749B2 (en) 2008-02-12 2015-03-31 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US8532342B2 (en) 2008-02-12 2013-09-10 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US8532341B2 (en) 2008-02-12 2013-09-10 Certusview Technologies, Llc Electronically documenting locate operations for underground utilities
US8340359B2 (en) 2008-02-12 2012-12-25 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US8543937B2 (en) 2008-02-12 2013-09-24 Certusview Technologies, Llc Methods and apparatus employing a reference grid for generating electronic manifests of underground facility marking operations
US8478635B2 (en) 2008-02-12 2013-07-02 Certusview Technologies, Llc Ticket approval methods of performing quality control in underground facility locate and marking operations
US8907978B2 (en) 2008-02-12 2014-12-09 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US9183646B2 (en) 2008-02-12 2015-11-10 Certusview Technologies, Llc Apparatus, systems and methods to generate electronic records of underground facility marking operations performed with GPS-enabled marking devices
US9659268B2 (en) 2008-02-12 2017-05-23 Certusview Technologies, Llc Ticket approval system for and method of performing quality control in field service applications
US9280269B2 (en) 2008-02-12 2016-03-08 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US8270666B2 (en) 2008-02-12 2012-09-18 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8265344B2 (en) 2008-02-12 2012-09-11 Certusview Technologies, Llc Electronic manifest of underground facility locate operation
US8630463B2 (en) 2008-02-12 2014-01-14 Certusview Technologies, Llc Searchable electronic records of underground facility locate marking operations
US8416995B2 (en) 2008-02-12 2013-04-09 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US9471835B2 (en) 2008-02-12 2016-10-18 Certusview Technologies, Llc Electronic manifest of underground facility locate marks
US8861795B2 (en) 2008-03-18 2014-10-14 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8861794B2 (en) 2008-03-18 2014-10-14 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US9830338B2 (en) 2008-03-18 2017-11-28 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US8218827B2 (en) 2008-03-18 2012-07-10 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8290215B2 (en) 2008-03-18 2012-10-16 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8249306B2 (en) 2008-03-18 2012-08-21 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8280117B2 (en) 2008-03-18 2012-10-02 Certusview Technologies, Llc Virtual white lines for indicating planned excavation sites on electronic images
US8355542B2 (en) 2008-03-18 2013-01-15 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8155390B2 (en) 2008-03-18 2012-04-10 Certusview Technologies, Llc Methods and apparatus for providing unbuffered dig area indicators on aerial images to delimit planned excavation sites
US8934678B2 (en) 2008-03-18 2015-01-13 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US8300895B2 (en) 2008-03-18 2012-10-30 Certusview Technologies, Llc Virtual white lines for delimiting planned excavation sites
US9317830B2 (en) 2008-06-27 2016-04-19 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations
US9578678B2 (en) 2008-06-27 2017-02-21 Certusview Technologies, Llc Methods and apparatus for facilitating locate and marking operations
US9916588B2 (en) 2008-06-27 2018-03-13 Certusview Technologies, Llc Methods and apparatus for quality assessment of a field service operation based on dynamic assessment parameters
US9256849B2 (en) 2008-06-27 2016-02-09 Certusview Technologies, Llc Apparatus and methods for evaluating a quality of a locate operation for underground utility
US9473626B2 (en) 2008-06-27 2016-10-18 Certusview Technologies, Llc Apparatus and methods for evaluating a quality of a locate operation for underground utility
US9004004B2 (en) 2008-07-10 2015-04-14 Certusview Technologies, Llc Optical sensing methods and apparatus for detecting a color of a marking substance
US8424486B2 (en) 2008-07-10 2013-04-23 Certusview Technologies, Llc Marker detection mechanisms for use in marking devices and methods of using same
US8361543B2 (en) 2008-10-02 2013-01-29 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a marking operation based on an electronic record of marking information
US8990100B2 (en) 2008-10-02 2015-03-24 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks
US8749239B2 (en) 2008-10-02 2014-06-10 Certusview Technologies, Llc Locate apparatus having enhanced features for underground facility locate operations, and associated methods and systems
US9069094B2 (en) 2008-10-02 2015-06-30 Certusview Technologies, Llc Locate transmitter configured to detect out-of-tolerance conditions in connection with underground facility locate operations, and associated methods and systems
US8766638B2 (en) 2008-10-02 2014-07-01 Certusview Technologies, Llc Locate apparatus with location tracking system for receiving environmental information regarding underground facility marking operations, and associated methods and systems
US9208458B2 (en) 2008-10-02 2015-12-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to facilities maps
US9046621B2 (en) 2008-10-02 2015-06-02 Certusview Technologies, Llc Locate apparatus configured to detect out-of-tolerance conditions in connection with underground facility locate operations, and associated methods and systems
US8577707B2 (en) 2008-10-02 2013-11-05 Certusview Technologies, Llc Methods and apparatus for overlaying electronic locate information on facilities map information and/or other image information displayed on a locate device
US8476906B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods and apparatus for generating electronic records of locate operations
US8583264B2 (en) 2008-10-02 2013-11-12 Certusview Technologies, Llc Marking device docking stations and methods of using same
US8589201B2 (en) 2008-10-02 2013-11-19 Certusview Technologies, Llc Methods and apparatus for generating alerts on a locate device, based on comparing electronic locate information to facilities map information and/or other image information
US8478525B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods, apparatus, and systems for analyzing use of a marking device by a technician to perform an underground facility marking operation
US8589202B2 (en) 2008-10-02 2013-11-19 Certusview Technologies, Llc Methods and apparatus for displaying and processing facilities map information and/or other image information on a marking device
US8600526B2 (en) 2008-10-02 2013-12-03 Certusview Technologies, Llc Marking device docking stations having mechanical docking and methods of using same
US8301380B2 (en) 2008-10-02 2012-10-30 Certusview Technologies, Llc Systems and methods for generating electronic records of locate and marking operations
US8612148B2 (en) 2008-10-02 2013-12-17 Certusview Technologies, Llc Marking apparatus configured to detect out-of-tolerance conditions in connection with underground facility marking operations, and associated methods and systems
US8612271B2 (en) 2008-10-02 2013-12-17 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to environmental landmarks
US8527308B2 (en) 2008-10-02 2013-09-03 Certusview Technologies, Llc Methods and apparatus for overlaying electronic locate information on facilities map information and/or other image information displayed on a locate device
US8620587B2 (en) 2008-10-02 2013-12-31 Certusview Technologies, Llc Methods, apparatus, and systems for generating electronic records of locate and marking operations, and combined locate and marking apparatus for same
US8620726B2 (en) 2008-10-02 2013-12-31 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations by comparing locate information and marking information
US9542863B2 (en) 2008-10-02 2017-01-10 Certusview Technologies, Llc Methods and apparatus for generating output data streams relating to underground utility marking operations
US8478524B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods and apparatus for dispensing marking material in connection with underground facility marking operations based on environmental information and/or operational information
US9208464B2 (en) 2008-10-02 2015-12-08 Certusview Technologies, Llc Methods and apparatus for analyzing locate and marking operations with respect to historical information
US8467969B2 (en) 2008-10-02 2013-06-18 Certusview Technologies, Llc Marking apparatus having operational sensors for underground facility marking operations, and associated methods and systems
US8644965B2 (en) 2008-10-02 2014-02-04 Certusview Technologies, Llc Marking device docking stations having security features and methods of using same
US8965700B2 (en) 2008-10-02 2015-02-24 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of environmental landmarks based on marking device actuations
US8930836B2 (en) 2008-10-02 2015-01-06 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a locate and/or marking operation using display layers
US8731830B2 (en) 2008-10-02 2014-05-20 Certusview Technologies, Llc Marking apparatus for receiving environmental information regarding underground facility marking operations, and associated methods and systems
US8280631B2 (en) 2008-10-02 2012-10-02 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation based on marking device actuations
US8510141B2 (en) 2008-10-02 2013-08-13 Certusview Technologies, Llc Methods and apparatus for generating alerts on a marking device, based on comparing electronic marking information to facilities map information and/or other image information
US8478617B2 (en) 2008-10-02 2013-07-02 Certusview Technologies, Llc Methods and apparatus for generating alerts on a locate device, based on comparing electronic locate information to facilities map information and/or other image information
US8770140B2 (en) 2008-10-02 2014-07-08 Certusview Technologies, Llc Marking apparatus having environmental sensors and operations sensors for underground facility marking operations, and associated methods and systems
US8457893B2 (en) 2008-10-02 2013-06-04 Certusview Technologies, Llc Methods and apparatus for generating an electronic record of a marking operation including service-related information and/or ticket information
US9177403B2 (en) 2008-10-02 2015-11-03 Certusview Technologies, Llc Methods and apparatus for overlaying electronic marking information on facilities map information and/or other image information displayed on a marking device
US8400155B2 (en) 2008-10-02 2013-03-19 Certusview Technologies, Llc Methods and apparatus for displaying an electronic rendering of a locate operation based on an electronic record of locate information
US8442766B2 (en) 2008-10-02 2013-05-14 Certusview Technologies, Llc Marking apparatus having enhanced features for underground facility marking operations, and associated methods and systems
US9773217B2 (en) 2009-02-10 2017-09-26 Certusview Technologies, Llc Methods, apparatus, and systems for acquiring an enhanced positive response for underground facility locate and marking operations
US8572193B2 (en) 2009-02-10 2013-10-29 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response in underground facility locate and marking operations
US9177280B2 (en) 2009-02-10 2015-11-03 Certusview Technologies, Llc Methods, apparatus, and systems for acquiring an enhanced positive response for underground facility locate and marking operations based on an electronic manifest documenting physical locate marks on ground, pavement, or other surface
US8902251B2 (en) 2009-02-10 2014-12-02 Certusview Technologies, Llc Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
US8280969B2 (en) 2009-02-10 2012-10-02 Certusview Technologies, Llc Methods, apparatus and systems for requesting underground facility locate and marking operations and managing associated notifications
US8543651B2 (en) 2009-02-10 2013-09-24 Certusview Technologies, Llc Methods, apparatus and systems for submitting virtual white line drawings and managing notifications in connection with underground facility locate and marking operations
US9646353B2 (en) 2009-02-10 2017-05-09 Certusview Technologies, Llc Methods, apparatus, and systems for exchanging information between excavators and other entities associated with underground facility locate and marking operations
US8549084B2 (en) 2009-02-10 2013-10-01 Certusview Technologies, Llc Methods, apparatus, and systems for exchanging information between excavators and other entities associated with underground facility locate and marking operations
US9235821B2 (en) 2009-02-10 2016-01-12 Certusview Technologies, Llc Methods, apparatus, and systems for providing an enhanced positive response for underground facility locate and marking operations based on an electronic manifest documenting physical locate marks on ground, pavement or other surface
US8468206B2 (en) 2009-02-10 2013-06-18 Certusview Technologies, Llc Methods, apparatus and systems for notifying excavators and other entities of the status of in-progress underground facility locate and marking operations
US8484300B2 (en) 2009-02-10 2013-07-09 Certusview Technologies, Llc Methods, apparatus and systems for communicating information relating to the performance of underground facility locate and marking operations to excavators and other entities
US8566737B2 (en) 2009-02-11 2013-10-22 Certusview Technologies, Llc Virtual white lines (VWL) application for indicating an area of planned excavation
US8626571B2 (en) 2009-02-11 2014-01-07 Certusview Technologies, Llc Management system, and associated methods and apparatus, for dispatching tickets, receiving field information, and performing a quality assessment for underground facility locate and/or marking operations
US8612276B1 (en) 2009-02-11 2013-12-17 Certusview Technologies, Llc Methods, apparatus, and systems for dispatching service technicians
US9563863B2 (en) 2009-02-11 2017-02-07 Certusview Technologies, Llc Marking apparatus equipped with ticket processing software for facilitating marking operations, and associated methods
US8731999B2 (en) 2009-02-11 2014-05-20 Certusview Technologies, Llc Management system, and associated methods and apparatus, for providing improved visibility, quality control and audit capability for underground facility locate and/or marking operations
US8296308B2 (en) 2009-02-11 2012-10-23 Certusview Technologies, Llc Methods and apparatus for associating a virtual white line (VWL) image with corresponding ticket information for an excavation project
US9185176B2 (en) 2009-02-11 2015-11-10 Certusview Technologies, Llc Methods and apparatus for managing locate and/or marking operations
US8356255B2 (en) 2009-02-11 2013-01-15 Certusview Technologies, Llc Virtual white lines (VWL) for delimiting planned excavation sites of staged excavation projects
US8384742B2 (en) 2009-02-11 2013-02-26 Certusview Technologies, Llc Virtual white lines (VWL) for delimiting planned excavation sites of staged excavation projects
US8832565B2 (en) 2009-02-11 2014-09-09 Certusview Technologies, Llc Methods and apparatus for controlling access to a virtual white line (VWL) image for an excavation project
US20140347396A1 (en) * 2009-04-03 2014-11-27 Certusview Technologies, Llc Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations
US8585410B2 (en) 2009-06-25 2013-11-19 Certusview Technologies, Llc Systems for and methods of simulating facilities for use in locate operations training exercises
US9646275B2 (en) 2009-06-25 2017-05-09 Certusview Technologies, Llc Methods and apparatus for assessing risks associated with locate request tickets based on historical information
US9159107B2 (en) 2009-07-07 2015-10-13 Certusview Technologies, Llc Methods, apparatus and systems for generating location-corrected searchable electronic records of underground facility locate and/or marking operations
US8928693B2 (en) 2009-07-07 2015-01-06 Certusview Technologies, Llc Methods, apparatus and systems for generating image-processed searchable electronic records of underground facility locate and/or marking operations
US9165331B2 (en) 2009-07-07 2015-10-20 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations and assessing aspects of same
US8830265B2 (en) 2009-07-07 2014-09-09 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility marking operations and assessing aspects of same
US9189821B2 (en) 2009-07-07 2015-11-17 Certusview Technologies, Llc Methods, apparatus and systems for generating digital-media-enhanced searchable electronic records of underground facility locate and/or marking operations
US8907980B2 (en) 2009-07-07 2014-12-09 Certusview Technologies, Llc Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US8917288B2 (en) 2009-07-07 2014-12-23 Certusview Technologies, Llc Methods, apparatus and systems for generating accuracy-annotated searchable electronic records of underground facility locate and/or marking operations
US8473148B2 (en) 2009-08-11 2013-06-25 Certusview Technologies, Llc Fleet management systems and methods for complex event processing of vehicle-related information via local and remote complex event processing engines
US8467932B2 (en) 2009-08-11 2013-06-18 Certusview Technologies, Llc Systems and methods for complex event processing of vehicle-related information
US8560164B2 (en) 2009-08-11 2013-10-15 Certusview Technologies, Llc Systems and methods for complex event processing of vehicle information and image information relating to a vehicle
US8311765B2 (en) 2009-08-11 2012-11-13 Certusview Technologies, Llc Locating equipment communicatively coupled to or equipped with a mobile/portable device
US8463487B2 (en) 2009-08-11 2013-06-11 Certusview Technologies, Llc Systems and methods for complex event processing based on a hierarchical arrangement of complex event processing engines
US8620616B2 (en) 2009-08-20 2013-12-31 Certusview Technologies, Llc Methods and apparatus for assessing marking operations based on acceleration information
US9097522B2 (en) 2009-08-20 2015-08-04 Certusview Technologies, Llc Methods and marking devices with mechanisms for indicating and/or detecting marking material color
US8620572B2 (en) 2009-08-20 2013-12-31 Certusview Technologies, Llc Marking device with transmitter for triangulating location during locate operations
US8600848B2 (en) 2009-11-05 2013-12-03 Certusview Technologies, Llc Methods, apparatus and systems for ensuring wage and hour compliance in locate operations
US8583372B2 (en) 2009-12-07 2013-11-12 Certusview Technologies, Llc Methods, apparatus, and systems for facilitating compliance with marking specifications for dispensing marking material
US8805640B2 (en) 2010-01-29 2014-08-12 Certusview Technologies, Llc Locating equipment docking station communicatively coupled to or equipped with a mobile/portable device
US9696758B2 (en) 2010-01-29 2017-07-04 Certusview Technologies, Llc Locating equipment docking station communicatively coupled to or equipped with a mobile/portable device
US9311614B2 (en) 2010-07-30 2016-04-12 Certusview Technologies, Llc Methods, apparatus and systems for onsite linking to location-specific electronic records of locate operations
US8918898B2 (en) 2010-07-30 2014-12-23 Certusview Technologies, Llc Methods, apparatus and systems for onsite linking to location-specific electronic records of locate operations
US8977558B2 (en) 2010-08-11 2015-03-10 Certusview Technologies, Llc Methods, apparatus and systems for facilitating generation and assessment of engineering plans
US9046413B2 (en) 2010-08-13 2015-06-02 Certusview Technologies, Llc Methods, apparatus and systems for surface type detection in connection with locate and marking operations
US9124780B2 (en) 2010-09-17 2015-09-01 Certusview Technologies, Llc Methods and apparatus for tracking motion and/or orientation of a marking device
USD684067S1 (en) 2012-02-15 2013-06-11 Certusview Technologies, Llc Modular marking device
US9721302B2 (en) * 2012-05-24 2017-08-01 State Farm Mutual Automobile Insurance Company Server for real-time accident documentation and claim submission
US10217168B2 (en) * 2012-05-24 2019-02-26 State Farm Mutual Automobile Insurance Company Mobile computing device for real-time accident documentation and claim submission
US10387960B2 (en) * 2012-05-24 2019-08-20 State Farm Mutual Automobile Insurance Company System and method for real-time accident documentation and claim submission
US11030698B2 (en) 2012-05-24 2021-06-08 State Farm Mutual Automobile Insurance Company Server for real-time accident documentation and claim submission
US9531722B1 (en) 2013-10-31 2016-12-27 Google Inc. Methods for generating an activity stream
US9542457B1 (en) 2013-11-07 2017-01-10 Google Inc. Methods for displaying object history information
US9614880B1 (en) 2013-11-12 2017-04-04 Google Inc. Methods for real-time notifications in an activity stream
US9509772B1 (en) 2014-02-13 2016-11-29 Google Inc. Visualization and control of ongoing ingress actions
US9536199B1 (en) 2014-06-09 2017-01-03 Google Inc. Recommendations based on device usage
US9507791B2 (en) 2014-06-12 2016-11-29 Google Inc. Storage system user interface with floating file collection
US10078781B2 (en) * 2014-06-13 2018-09-18 Google Llc Automatically organizing images
US9870420B2 (en) 2015-01-19 2018-01-16 Google Llc Classification and storage of documents

Also Published As

Publication number Publication date
US20110282542A9 (en) 2011-11-17
CA2761794C (en) 2016-06-28
US20100256863A1 (en) 2010-10-07
WO2010114619A1 (en) 2010-10-07
US8612090B2 (en) 2013-12-17
CA2761794A1 (en) 2010-10-07
US20100257477A1 (en) 2010-10-07
US20140347396A1 (en) 2014-11-27
US20130116855A1 (en) 2013-05-09
US8260489B2 (en) 2012-09-04
WO2010114620A1 (en) 2010-10-07

Similar Documents

Publication Publication Date Title
US20100256981A1 (en) Methods, apparatus, and systems for documenting and reporting events via time-elapsed geo-referenced electronic drawings
US8977558B2 (en) Methods, apparatus and systems for facilitating generation and assessment of engineering plans
US8907978B2 (en) Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations
US8649610B2 (en) Methods and apparatus for auditing signage
US8902251B2 (en) Methods, apparatus and systems for generating limited access files for searchable electronic records of underground facility locate and/or marking operations
CA2821107C (en) Methods and apparatus for indicating a planned excavation
US20080170755A1 (en) Methods and apparatus for collecting media site data
WO2009126159A1 (en) Methods and apparatus for auditing signage
Kashishian et al. Digital Video Surveying
AU2013204039A1 (en) Methods, apparatus and systems for generating searchable electronic records of underground facility locate and/or marking operations

Legal Events

Date Code Title Description
AS Assignment

Owner name: CERTUSVIEW TECHNOLOGIES, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NIELSEN, STEVEN;CHAMBERS, CURTIS;FARR, JEFFREY;SIGNING DATES FROM 20100507 TO 20100514;REEL/FRAME:024602/0175

Owner name: CERTUSVIEW TECHNOLOGIES, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NIELSEN, STEVEN;CHAMBERS, CURTIS;FARR, JEFFREY;SIGNING DATES FROM 20090428 TO 20090501;REEL/FRAME:024602/0069

Owner name: CERTUSVIEW TECHNOLOGIES, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NIELSEN, STEVEN;CHAMBERS, CURTIS;FARR, JEFFREY;SIGNING DATES FROM 20090506 TO 20090507;REEL/FRAME:024602/0016

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION