US20150092048A1 - Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation - Google Patents

Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation

Info

Publication number
US20150092048A1
Authority
US
United States
Prior art keywords
mobile device
pose
target
image
ekf
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/497,117
Inventor
Christopher Brunner
Arvind Ramanandan
Mahesh Ramachandran
Abhishek Tyagi
Murali Ramaswamy Chari
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Priority to US14/497,117
Priority to PCT/US2014/057630 (published as WO2015048397A1)
Assigned to QUALCOMM INCORPORATED (assignment of assignors' interest; see document for details). Assignors: TYAGI, ABHISHEK; CHARI, MURALI RAMASWAMY; RAMACHANDRAN, MAHESH; BRUNNER, CHRISTOPHER; RAMANANDAN, ARVIND
Publication of US20150092048A1
Legal status: Abandoned


Classifications

    • G01C 21/383: Electronic maps specially adapted for navigation; creation or updating of map data: indoor data
    • G06K 9/00671
    • G01C 21/1654: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments: with electromagnetic compass
    • G01C 21/1656: Inertial navigation combined with non-inertial navigation instruments: with passive imaging devices, e.g. cameras
    • G01C 21/206: Instruments for performing navigational calculations, specially adapted for indoor navigation
    • G01C 21/3885: Transmission of map data to client devices; reception of map data by client devices
    • G01C 25/005: Initial alignment, calibration or starting-up of inertial devices
    • G01S 19/13: Satellite radio beacon positioning systems transmitting time-stamped messages (e.g. GPS, GLONASS or GALILEO): receivers
    • G01S 19/14: Receivers specially adapted for specific applications
    • G01S 5/0264: Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems, at least one being a non-radio-wave positioning system
    • G01S 5/0294: Trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
    • G01S 5/16: Position-fixing using electromagnetic waves other than radio waves
    • G01S 5/163: Determination of attitude
    • G06T 7/248: Analysis of motion using feature-based methods (e.g. tracking of corners or segments) involving reference images or patches
    • G06T 7/251: Analysis of motion using feature-based methods involving models
    • G06T 7/579: Depth or shape recovery from multiple images, from motion
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 20/20: Scene-specific elements in augmented reality scenes
    • H04W 4/021: Services related to particular areas, e.g. point-of-interest (POI) services, venue services or geofences
    • H04W 4/024: Guidance services
    • H04W 4/029: Location-based management or tracking services
    • H04W 4/33: Services specially adapted for indoor environments, e.g. buildings
    • G06T 2207/10004: Image acquisition modality: still image; photographic image
    • G06T 2207/10016: Image acquisition modality: video; image sequence
    • G06T 2207/30244: Subject of image: camera pose
    • H04M 2250/12: Telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04M 2250/52: Telephonic subscriber devices including functional features of a camera

Definitions

  • Camera-based tracking of a mobile device's rotation and translation can be executed by the mobile device (e.g., mobile phone, tablet, heads-up display, and the like) to enable the mobile device to provide a wide variety of features, such as augmented reality and location tracking.
  • a mobile device may additionally incorporate information from sensors such as gyroscopes, accelerometers, GPS receivers, and the like.
  • sensor noise and modeling errors can cause a tracking system to “drift,” resulting in inaccurate pose determinations.
  • GPS reception is poor to non-existent indoors and inertial sensors alone cannot provide absolute pose.
  • VIT: Visual Inertial Tracker
  • SLAM: Simultaneous Localization And Mapping
  • EKF-SLAM: SLAM based on an Extended Kalman Filter (EKF) framework
  • An example method of correcting drift in a tracking system of a mobile device includes obtaining location information regarding a target, obtaining an image of the target, the image captured by the mobile device, and estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information.
  • the pose comprises information indicative of a translation and orientation of the mobile device.
  • the method further comprises correcting a pose determination of the mobile device using an EKF, based, at least in part, on the measurements relating to the pose of the mobile device.
  • the example method of correcting drift in a tracking system of a mobile device can include one or more of the following features.
  • the method can include obtaining absolute coordinates of the target, where correcting the pose is further based, at least in part, on the absolute coordinates of the target.
  • the method can include processing the image of the target to determine that the target was captured in the image, where the processing includes comparing one or more keypoints of the image with one or more keypoints of each target in a plurality of known targets.
  • the method can include receiving one or more wireless signals from one or more access points, determining a proximity of the one or more access points based on wireless signals, and determining the plurality of known targets, based on the determined proximity of the one or more access points.
  • the tracking system can incorporate a Simultaneous Localization And Mapping (SLAM) system with the EKF.
  • the pose determination can be based, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device.
  • the method can include determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
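As an illustration of the correction step described above, the following is a minimal sketch of folding a target-derived pose measurement into an EKF update. The state layout, dimensions, and helper names here are assumptions for illustration, not the filter design claimed by this application:

```python
import numpy as np

# Hypothetical error-state layout: [position(3), orientation(3),
# velocity(3), gyro bias(3), accel bias(3)]. This is an assumption
# for illustration, not the application's actual filter design.
STATE_DIM = 15

def ekf_pose_update(x, P, pose_meas, R_meas):
    """Correct the filter with a 6-DOF pose measurement (translation +
    small-angle orientation) derived from an image of a known target."""
    H = np.zeros((6, STATE_DIM))       # measurement model: the pose
    H[:6, :6] = np.eye(6)              # directly observes states 0..5

    innovation = pose_meas - H @ x     # measured-vs-predicted residual
    S = H @ P @ H.T + R_meas           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain

    x_new = x + K @ innovation         # corrected state; bias states
                                       # move via cross-covariances in P
    P_new = (np.eye(STATE_DIM) - K @ H) @ P
    return x_new, P_new
```

Because the filter covariance correlates the pose states with the bias states, an update of this form also nudges the accelerometer and gyroscope bias estimates, which is one plausible way the bias-determination feature above could be realized.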
  • An example mobile device can include a camera, a memory, and a processing unit.
  • the processing unit is operatively coupled with the camera and the memory and configured to obtain location information regarding a target, obtain an image of the target, the image captured by the camera of the mobile device, and estimate, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device.
  • the processing unit is further configured to correct a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • the example mobile device can include one or more of the following features.
  • the processing unit can be configured to obtain absolute coordinates of the target and further configured to correct the pose based, at least in part, on the absolute coordinates of the target.
  • the processing unit can be configured to process the image of the target to determine that the target was captured in the image, where processing the image includes comparing one or more features of the image with one or more features of each target in a plurality of known targets.
  • the mobile device may include a wireless communication interface configured to receive one or more wireless signals from one or more access points, and the processing unit can be further configured to determine a proximity of the one or more access points based on wireless signals, and determine the plurality of known targets, based on the determined proximity of the one or more access points.
  • the processing unit can be configured to incorporate a Simultaneous Localization And Mapping (SLAM) system with the EKF.
  • the mobile device can include one or more motion sensors, and the processing unit can be further configured to determine the pose determination based, at least in part, on one or more measurements received from the one or more motion sensors.
  • the one or more motion sensors can include one or more of an accelerometer or a gyroscope.
  • the processing unit can be configured to determine a bias of the one or more motion sensors, based at least in part on the pose determination and the corrected pose.
  • An example apparatus can include means for obtaining location information regarding a target, means for obtaining an image of the target, the image captured by a mobile device, and means for estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device.
  • the example apparatus further includes means for correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • the example apparatus can further include one or more of the following features.
  • the apparatus can include means for obtaining absolute coordinates of the target, where the means for correcting the pose is configured to base the corrected pose, at least in part, on the absolute coordinates of the target.
  • the apparatus can include means for processing the image of the target to determine that the target was captured in the image, where the means for processing the image include means for comparing one or more features of the image with one or more features of each target in a plurality of known targets.
  • the apparatus can include means for receiving one or more wireless signals from one or more access points, means for determining a proximity of the one or more access points based on wireless signals, and means for determining the plurality of known targets, based on the determined proximity of the one or more access points.
  • the apparatus can include means for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF.
  • the apparatus can include means for basing the pose determination, at least in part, on measurements of one or more of an accelerometer or a gyroscope of the mobile device.
  • the apparatus can include means for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
  • An example non-transitory machine-readable medium can have instructions embedded thereon for correcting drift in a tracking system of a mobile device.
  • the instructions include computer code for obtaining location information regarding a target, obtaining an image of the target, the image captured by the mobile device, and estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device.
  • the instructions also include computer code for correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • the example non-transitory machine-readable medium can further include instructions including computer code for one or more of the following features.
  • Instructions can include computer code for obtaining absolute coordinates of the target, wherein the computer code for correcting the pose is further configured to base the correction, at least in part, on the absolute coordinates of the target.
  • Instructions can include computer code for processing the image of the target to determine that the target was captured in the image, wherein the computer code for processing includes computer code for comparing one or more features of the image with one or more features of each target in a plurality of known targets.
  • Instructions can include computer code for receiving one or more wireless signals from one or more access points, determining a proximity of the one or more access points based on wireless signals, and determining the plurality of known targets, based on the determined proximity of the one or more access points.
  • Instructions can include computer code for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF.
  • the computer code can be configured to base the pose determination, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device.
  • Instructions can include computer code for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
  • Items and/or techniques described herein may provide one or more of the following capabilities, as well as other capabilities not mentioned.
  • Techniques can provide for the mitigation of drift in an indoor location tracking system, such as a Visual Inertial Tracker (VIT), providing for increased accuracy. This, in turn, can lead to a better user experience of applications and/or other features of a mobile device that are dependent on the indoor location tracking system.
  • FIG. 1 is a simplified image that can help illustrate how a Visual Inertial Tracker (VIT) can utilize targets for pose estimation and/or correction, according to an embodiment.
  • FIG. 2 is a block diagram of an example VIT.
  • FIG. 3 is a flow chart of a high-level process of drift correction, according to an embodiment, which can be executed by a VIT or other tracking system.
  • FIG. 4 is a flow diagram of a method of correcting drift in a VIT or other tracking system, according to an embodiment.
  • FIG. 5 is a block diagram of an embodiment of a mobile device.
  • Mobile devices such as mobile phones, media players, tablets, head-mounted displays (HMDs) and other wearable electronic devices, and the like, can often execute applications and/or provide features that utilize the mobile device's translation and orientation, or “pose.”
  • Tracking the mobile device's pose in a spatial coordinate system, such as the Earth-Centered, Earth-Fixed (ECEF) coordinate system, can be accomplished in any of a variety of ways. Oftentimes, this is done utilizing built-in sensors of the mobile device, such as accelerometers, gyroscopes, a Global Positioning System (GPS) receiver, and the like.
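For concreteness, a standard WGS-84 geodetic-to-ECEF conversion (textbook formulas, not taken from this application) looks like the following sketch:

```python
import math

# Standard WGS-84 constants (not from this application).
A = 6378137.0              # semi-major axis, meters
E2 = 6.69437999014e-3      # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic latitude/longitude/altitude to ECEF meters."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + alt_m) * math.sin(lat)
    return x, y, z
```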
  • a determination of the mobile device's pose can be used to enable or enhance navigation, games, and/or other applications.
  • a VIT (Visual Inertial Tracker) combines visual tracking systems, which utilize visual sensors such as cameras, with inertial tracking systems, which utilize inertial sensors such as accelerometers and gyroscopes.
  • Other sensors can be utilized as well, such as a barometer or altimeter to determine and/or correct altitude measurements. That said, where GPS coordinates are available, they may also be used to provide absolute location information to a VIT.
  • PTAM: keyframe-based Parallel Tracking And Mapping system
  • Despite using both visual and inertial measurements, drift can still be a problem for such VITs. In other words, inaccuracies in these inputs can accumulate over time.
  • the drift of current state-of-the-art VIT systems is about 1% of distance traveled. So, for example, a VIT executed on a mobile device held by a walking user will drift 1 meter for every 100 meters the user walks.
  • Embodiments are described herein that can provide pose correction and/or drift correction in VITs (such as the EKF-SLAM system previously described) by providing a pose measurement derived from an image of a target with a known position. This pose measurement can be used to correct drift. Moreover, such correction can take place each time a VIT captures an image of a known target.
  • FIG. 1 is a simplified image that can help illustrate how a VIT can utilize such targets for pose correction, according to one embodiment.
  • a user can use an application executed on a mobile device 120 to track the user's position in a store. Depending on the functionality of the application, it may display information on a display 130 of the mobile device 120 , such as the user's position on a map of the store and/or where certain items may be located.
  • the application may track the user's position from information obtained by a VIT, which is also executed by the mobile device 120 .
  • the camera may capture an image 100 (e.g., a video frame) of a certain portion of the store that may have one or more targets 110 .
  • Targets 110 can be images or objects with known locations, shown in FIG. 1 as aisle signs (targets 110-1 and 110-2 at the end of aisles 1 and 2, respectively).
  • the targets 110 include one or more keypoints recognizable by the detection algorithm that uses the keypoints to provide the pose measurement to the VIT. When one or more targets enter the camera view, an accurate pose can be determined as a result of the known location(s) of the keypoints on the target(s) 110 .
  • if the image 100 reveals that the pose obtained from the keypoints on the target 110-1 corresponds to the pose determined by the VIT when the image 100 was taken, then no drift correction is needed. However, if there is a mismatch between the pose obtained from the keypoints on the target 110-1 and the pose determined by the VIT, then drift correction is needed.
  • drift correction can include resetting the VIT's pose by replacing the pose previously determined by the VIT with the pose obtained using the keypoints on the target 110 - 1 .
  • some embodiments may avoid the additional processing requirements it would take to frequently run keypoint detection by analyzing the VIT pose and VIT pose uncertainty and executing keypoint detection only if keypoints on a target are predicted to be in view.
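A rough sketch of such gating logic follows; it is purely illustrative, and the `Pose` container, the projection convention, and the uncertainty-to-margin heuristic are all assumptions:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    R: np.ndarray   # 3x3 rotation, world-to-camera (assumed convention)
    t: np.ndarray   # translation, 3-vector

def should_run_detection(pose, pose_cov, target_xyz, K, image_size,
                         margin_px=50.0):
    """Run costly keypoint detection only when a known target is
    predicted to be in view, inflating the margin with pose uncertainty."""
    p_cam = pose.R @ target_xyz + pose.t
    if p_cam[2] <= 0.0:
        return False                     # target is behind the camera
    u, v, _ = K @ (p_cam / p_cam[2])     # pinhole projection to pixels
    # Crude heuristic: accept a wider region when the pose is uncertain.
    m = margin_px + 100.0 * np.sqrt(np.trace(pose_cov))
    w, h = image_size
    return -m <= u <= w + m and -m <= v <= h + m
```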
  • a pose can be calculated for each image in which a target appears, once per given period of time and/or set of video frames, once for each series of video frames in which a target appears, and the like.
  • FIG. 2 is a block diagram of an example VIT 200 .
  • This VIT 200 employs an EKF-SLAM topology that utilizes a computer vision (CV) component 210 and an EKF component 220 .
  • these components can be executed in hardware and/or software of a mobile device 120, an example of which is described in further detail below with regard to FIG. 5.
  • the CV component 210 can receive images, or camera frames, from a mobile device, where the camera frames have accurate time stamps. This enables the CV component 210 to determine when an image was captured, so that the image can be combined or fused in the EKF with time-stamped inertial sensor measurements from the same mobile device.
  • the camera frames can be captured as a series of still images and/or as part of video.
  • Embodiments utilizing video capture can, for example, receive images at 30 frames per second. Other embodiments may utilize other frame rates.
  • only a portion of frames captured by a camera may be provided to and/or utilized by the CV component.
  • Some embodiments may have CV components that utilize all frames.
  • the CV component can also receive camera calibration information.
  • Intrinsic camera calibration includes, for example, principal point, focal length at infinity, and radial distortion.
  • Extrinsic camera calibration parameters include rotation and translation with respect to the inertial sensor chip. Rotation can be estimated in the VIT or assumed to be aligned with the camera. All other camera calibration parameters are very similar across mobile devices of the same model. Hence, obtaining them from one unit of a certain model of mobile phone allows the calibration parameters to be applied to all mobile phones of that model.
  • the CV component 210 can employ any of a variety of algorithms to implement keypoint detection and keypoint tracking on the received camera frames. Keypoint detection can be based on Harris corners. Keypoint tracking can be based on Normalized Cross-Correlation (NCC).
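As a hedged sketch of those two operations using OpenCV (the parameter values are illustrative defaults, not values specified by this application):

```python
import cv2

def detect_harris_keypoints(gray, max_corners=200):
    """Keypoint detection based on Harris corners, via OpenCV's
    goodFeaturesToTrack with the Harris detector enabled."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=10,
                                  useHarrisDetector=True, k=0.04)
    return [] if pts is None else pts.reshape(-1, 2)

def track_keypoint_ncc(prev_gray, cur_gray, pt, patch=8, search=24,
                       min_score=0.9):
    """Keypoint tracking based on Normalized Cross-Correlation (NCC):
    match the patch around `pt` from the previous frame within a search
    window in the current frame. Frames are assumed equal-sized."""
    x, y = int(pt[0]), int(pt[1])
    h, w = cur_gray.shape
    if not (search <= x < w - search and search <= y < h - search):
        return None   # too close to the border to search
    tmpl = prev_gray[y - patch:y + patch + 1, x - patch:x + patch + 1]
    win = cur_gray[y - search:y + search + 1, x - search:x + search + 1]
    res = cv2.matchTemplate(win, tmpl, cv2.TM_CCORR_NORMED)
    _, score, _, loc = cv2.minMaxLoc(res)
    if score < min_score:
        return None   # reject weak correlations
    # Convert the match location back to full-image coordinates.
    return (x - search + loc[0] + patch, y - search + loc[1] + patch)
```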
  • the keypoint detection/tracking provides 2-D keypoint measurements in the camera frame that are relayed to the EKF component 220 .
  • the EKF component 220, utilizing sensor measurements (as detailed below), can calculate and share the predicted 2D camera coordinates of keypoints with the CV component 210 to limit the search space of the image point finder, increasing the efficiency of the process.
  • the 2D camera coordinates of keypoints measured by the image point finder and provided to the EKF component 220 are ultimately used to estimate the pose of the mobile device 120 .
  • the EKF component 220 utilizes the 2-D keypoint measurements from the CV component 210, together with sensor measurements from a gyroscope (“gyro meas” in FIG. 2), accelerometer (“accel meas”), and the like, to jointly estimate the three-dimensional (3D) position of the keypoints, the biases on the accelerometers and gyroscopes, and the gravity vector.
  • GPS measurements can also be provided to the EKF component to provide an absolute coordinate framework in which the mobile device 120 may be tracked.
  • a mobile device 120 initially may be in a location that can receive GPS measurements, and may therefore determine absolute location coordinates for the mobile device 120 .
  • the VIT 200 may determine absolute coordinates of the mobile device 120 based on the mobile device's movement relative to a position in which absolute coordinates were determined based on GPS information.
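One plausible way to chain the last absolute fix with the VIT's relative motion since that fix, assuming rotation matrices and a shared world frame (the conventions are assumptions, not the application's stated method):

```python
import numpy as np

def propagate_absolute_pose(R_fix, t_fix, R_rel, t_rel):
    """Chain the last absolute pose (e.g., from a GPS fix: rotation
    R_fix, translation t_fix in the world frame) with the relative
    motion (R_rel, t_rel) the VIT has tracked since that fix."""
    R_abs = R_fix @ R_rel              # compose rotations
    t_abs = R_fix @ t_rel + t_fix      # rotate relative motion into world
    return R_abs, t_abs
```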
  • Embodiments can further provide, as an input to the EKF component 220 , a pose measurement derived from an image of a target.
  • the pose measurement can come from a keypoint detector, which can not only detect keypoints of a target in an image but also determine the pose of the mobile device 120 based on the image.
  • the pose measurement may be provided from the CV component 210 and/or may be derived from 2D camera coordinates. Other embodiments may provide additional and/or alternative components to provide target detection and/or pose measurement.
  • Details regarding the targets can be locally stored and/or accessible by the VIT 200 . Details can include location information such as absolute location of the target and/or its keypoints for pose calculation based on an image of the target. Additionally or alternatively, the details can include information regarding how the target may be identified.
  • the absolute pose can be determined and used by the EKF component 220 to correct for any drift that might have taken place. Correction can include, for example, overriding a pose calculated by the EKF component 220 with the newly-determined pose measurement.
  • the EKF component 220 can output various types of data. As indicated in FIG. 2 , for example, the EKF component 220 can output bias of an accelerometer, gyroscope, and/or other sensor (“accel bias gyro bias”), the determined pose of the mobile device (“pose of the phone”), 3D locations of keypoints (“3-D locations of keypoints”), and/or an estimation of the gravity vector (“gravity”). Any or all of these outputs may be influenced by the pose measurement determined from a detected target in a camera frame. The EKF component 220 seeks to minimize innovations between predicted and measured 2-D camera keypoints and can adjust inertial sensor biases, pose, gravity vector, and location of keypoints in 3-D to that end.
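The innovation the filter minimizes can be sketched as follows, assuming a pinhole camera model with intrinsic matrix K and a world-to-camera pose convention (both assumptions for illustration):

```python
import numpy as np

def keypoint_innovation(R, t, p_world, K, uv_measured):
    """Residual between a measured 2-D keypoint and its prediction from
    the current state: camera pose (R, t, world-to-camera) and the 3-D
    keypoint location p_world. K is the 3x3 intrinsic matrix."""
    p_cam = R @ p_world + t                  # world point in camera frame
    uv_pred = (K @ (p_cam / p_cam[2]))[:2]   # pinhole projection
    return uv_measured - uv_pred             # EKF drives this toward zero
```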
  • the creation of keypoints on targets for pose correction in a VIT can be done in any number of different ways, for any number of different applications. For instance, if a picture of the target is taken from the fronto-parallel view, keypoints can be determined using the FAST corner algorithm, scale can be provided by measuring the distance between two of the keypoints, and descriptors can be obtained from the pixel measurements in the vicinity of the respective keypoints.
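A minimal sketch of that offline preparation using OpenCV's FAST detector; the choice of keypoint pair for scale and the raw-pixel-patch descriptor are illustrative assumptions:

```python
import cv2
import numpy as np

def build_target_keypoints(gray, known_dist_m, patch=8):
    """Offline target preparation along the lines described: FAST
    corners, scale from a measured distance between two keypoints, and
    raw pixel-patch descriptors (an illustrative stand-in for whatever
    descriptor a deployed system would use)."""
    fast = cv2.FastFeatureDetector_create(threshold=25)
    kps = fast.detect(gray, None)
    if len(kps) < 2:
        return None
    pts = np.array([kp.pt for kp in kps], dtype=np.float32)

    # Scale: meters per pixel, here anchored to the two farthest-apart
    # keypoints (the choice of keypoint pair is an assumption).
    dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    meters_per_px = known_dist_m / dists.max()

    descriptors = []
    h, w = gray.shape
    for x, y in pts.astype(int):
        if patch <= x < w - patch and patch <= y < h - patch:
            descriptors.append(
                gray[y - patch:y + patch, x - patch:x + patch].flatten())
    return pts, meters_per_px, descriptors
```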
  • the placement and designation of targets for a venue can vary, depending on desired functionality. Broadly speaking, the more targets that are located in and distributed throughout a venue, the more drift correction they can provide to a VIT, providing more accurate pose determination.
  • the designation of targets and the creation of a venue map can be facilitated through an application on a mobile device. Additionally or alternatively, the data associated with the map—such as location information regarding the targets utilized to obtain pose measurements using the techniques described herein—can be collected with the designation of targets and incorporated into the map.
  • FIG. 3 is a flow chart of a high-level process of drift correction, according to one embodiment, which can be executed by a VIT or other tracking system. More specifically, means for performing one or more of the illustrated components can include hardware and/or software means described in further detail in relation to FIG. 5 , which may be logically separated into separate components, such as the components described in FIG. 2 . Some or all of the components can be executed by hardware and/or software at an operating system or device level. Other embodiments may include alterations to the embodiments shown. Components shown in FIG. 3 may be performed in a different order and/or simultaneously, according to different embodiments. Moreover, a person of ordinary skill in the art will recognize many additions, omissions, and/or other variations.
  • the process can start by receiving a camera image at block 310 .
  • the type of camera image can vary in resolution, color, and/or other characteristics, depending on desired functionality, camera hardware, and/or other factors.
  • the camera image may be a discrete still image or may be one of several frames of video captured by the camera.
  • the image may be processed to a degree before it is received by a VIT, to facilitate further image processing by the VIT.
  • the VIT optionally receives WiFi signals, which can facilitate the determination of which targets may be included in the received image.
  • WiFi signals may be utilized together with a map of a venue that includes the identity and locations of WiFi access points. If the locations of certain access points can be determined from the map, the VIT can get a rough estimate of where in the venue the VIT (and any mobile device associated therewith) is. The VIT can do this by measuring WiFi signals received from the WiFi access points by the mobile device (e.g., measuring received signal strength (RSSI), round-trip time (RTT), and/or other measurements) to determine a proximity of the access points—including which access points may be closest. This can then be compared with the map to determine a region in the venue in which the mobile device is located.
  • RSSI received signal strength
  • RTT round-trip time
  • the VIT of the mobile device can then reduce processing loads related to target detection by determining nearby targets based on the WiFi signals, at block 330 , and reducing the targets to detect to the nearby targets.
  • Such optional functionality can be beneficial, for instance, when a customer starts an application that uses the map only after having entered the venue. With no location initially, the VIT can benefit from detecting rough location from WiFi signals.
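A toy sketch of narrowing candidate targets from WiFi measurements; the data structures (`rssi_by_ap`, `venue_map`) and the strongest-signal heuristic are assumptions, not structures defined by this application:

```python
def nearby_targets(rssi_by_ap, venue_map, top_n=2):
    """Narrow candidate targets to those near the strongest access
    points. `rssi_by_ap` maps AP identifiers (e.g., BSSIDs) to measured
    RSSI in dBm; `venue_map` maps AP identifiers to lists of target IDs
    placed in that AP's region."""
    # Stronger (less negative) RSSI is treated as closer proximity.
    closest = sorted(rssi_by_ap, key=rssi_by_ap.get, reverse=True)[:top_n]
    candidates = set()
    for ap in closest:
        candidates.update(venue_map.get(ap, []))
    return candidates
```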
  • tight integration of GPS with VIT would allow for an initial position estimate using GPS measurements (code, Doppler, carrier phase) in addition to the regular VIT measurements listed above.
  • although WiFi signals are described in the embodiment shown in FIG. 3, additional or alternative wireless signals may be utilized.
  • the image is processed to detect targets.
  • targets can be images or objects with known locations.
  • the VIT can utilize a detection algorithm in which the image is processed to determine whether certain keypoints of the image match with keypoints of known targets by, for example, comparing the keypoints of the image with keypoints of one or more known targets (e.g., targets having keypoints and location information stored for comparison).
  • Techniques for making the determination can vary based on the detection algorithm(s) used, and can implement a variety of detection and matching techniques, from simple edge detection to the recognition of more complex patterns, symbols, and more.
  • detection algorithms can determine whether a target is in the image by determining whether one or more keypoints in the image match with one or more corresponding keypoints of a known target to a degree above a threshold level of certainty.
  • pose determination can utilize any of a variety of techniques to determine pose based on the known location of the target in the image, as well as information obtained from and/or associated with the image itself.
  • the VIT can determine a distance and orientation of the target in relation to the mobile device (e.g., by analyzing characteristics of detected keypoints in the image, such as location, spacing, etc.), and use this information, together with the known location (and orientation) of the target, to determine a pose of the mobile device.
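A standard way to realize such a pose measurement is a perspective-n-point (PnP) solve over the matched keypoints; the following OpenCV-based sketch is one plausible realization, not necessarily the method used here:

```python
import cv2
import numpy as np

def pose_from_target(target_pts_3d, image_pts_2d, K):
    """Estimate camera pose from matched target keypoints: 3-D keypoint
    locations known from the venue map and their detected 2-D image
    locations."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(target_pts_3d, dtype=np.float64),
        np.asarray(image_pts_2d, dtype=np.float64),
        K, None)                       # None: points assumed undistorted
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)         # world-to-camera rotation
    cam_pos = -R.T @ tvec.ravel()      # camera position in world frame
    return R, cam_pos
```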
  • the pose from the target can be provided to an EKF of the VIT in the manner described previously, allowing the VIT to correct (e.g., adjust or replace) a pose determination (which may have been previously and/or separately determined from visual and/or inertial sensor input), based on the pose provided in the process of FIG. 3.
  • FIG. 4 is a flow diagram of another, more generalized method 400 of correcting drift in a VIT or other tracking system, according to one embodiment.
  • Means for performing one or more of the components of the method 400 can include hardware and/or software means described in further detail in relation to FIG. 5 , which may be logically separated into different components, such as the components described in FIG. 2 .
  • the method 400 and other techniques described herein, can be executed by hardware and/or software at an operating system or device level. Alternative embodiments may include alterations to the embodiments shown.
  • Components of the method 400 although illustrated in a particular order, may be performed in a different order and/or simultaneously, according to different embodiments.
  • a person of ordinary skill in the art will recognize many additions, omissions, and/or other variations.
  • location information regarding a target is obtained.
  • location information regarding a target can include keypoints associated with coordinates in a spatial coordinate system.
  • Corresponding descriptors of one or more targets can be associated with a map, such as a map of a location in which a VIT system is used to track a mobile device's pose.
  • this information (as well as information for other targets of a venue) can be stored on a server of the venue, and transferred to a mobile device (e.g., wirelessly via WiFi using an application executed by the mobile device) when the mobile device enters or approaches the venue.
  • an image of the target is captured by the mobile device.
  • the image can be captured as part of a VIT tracking process, and may be one of a series of video frames.
  • the image may be processed to extract keypoints from the image and use one or more detection algorithms to determine whether the target is in the image. For example, algorithms may include comparing one or more keypoints extracted from the image with one or more keypoints of each target in a plurality of known targets of a venue.
  • measurements relating to a pose of the mobile device are estimated using the keypoints positioned on the targets.
  • positioned keypoints can be used to reveal the pose of the mobile device in a spatial coordinate system.
  • a pose determination of the tracking system is corrected using an EKF, based on the measurements relating to the pose of the mobile device.
  • the tracking system can use visual and inertial information to make the pose determination of the mobile device, which can be used in various applications, such as indoor navigation, augmented reality, and more.
  • the pose determination is subject to drift, it can be corrected (e.g., modified or replaced) by providing measurements estimated from the image to an EKF.
  • a pose measurement can be obtained from a target using the process of keypoint detection, keypoint matching, outlier rejection, and pose estimation as described above.
  • the pose measurement can then be provided to an EKF component to correct the pose of the mobile device.
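Putting the described pipeline together (keypoint detection, matching, outlier rejection, pose estimation), a hedged end-to-end sketch follows; ORB descriptors and the `target` record are illustrative stand-ins for whatever a deployed system would use:

```python
import cv2
import numpy as np

def target_pose_measurement(frame_gray, target, K):
    """End-to-end sketch: detect keypoints, match them against a known
    target, reject outliers with RANSAC, and estimate pose. `target` is
    a hypothetical record holding ORB-style (uint8) descriptors and the
    corresponding 3-D keypoint coordinates from the venue map."""
    orb = cv2.ORB_create()
    kps, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc, target.descriptors)
    if len(matches) < 6:
        return None

    img_pts = np.float64([kps[m.queryIdx].pt for m in matches])
    obj_pts = np.float64([target.points_3d[m.trainIdx] for m in matches])

    # RANSAC-based PnP rejects mismatched keypoints as outliers.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, None, reprojectionError=4.0)
    if not ok or inliers is None or len(inliers) < 6:
        return None
    return rvec, tvec   # feed to the EKF as the pose measurement
```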
  • a correction to a pose may not involve an EKF.
  • a customer may be able to simply point the phone at a target to get a pose, a map, and his or her position on the map, without any prior and/or subsequent tracking. That is, the pose obtained from a target may provide an initial pose to a VIT in addition or as an alternative to replacing a pose previously determined by the VIT.
  • a VIT may further obtain absolute coordinates of the target, enabling the VIT to correct a determined pose of the mobile device using the keypoints of the target and absolute coordinates associated therewith.
  • FIG. 5 is a block diagram of an embodiment of a mobile device 120 , which can implement the techniques for correcting a pose determination of the tracking system, such as the method 400 shown in FIG. 4 .
  • FIG. 5 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. Moreover, system elements may be implemented in a relatively separated or relatively more integrated manner. Additionally or alternatively, some or all of the components shown in FIG. 5 can be utilized in another computing device, which can be used in conjunction with a mobile device 120 as previously described.
  • the mobile device 120 is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include a processing unit 510 which can include without limitation one or more general-purpose processors, one or more special-purpose processors (such as digital signal processors (DSPs), graphics acceleration processors, application specific integrated circuits (ASICs), and/or the like), and/or other processing structure or means, which can be configured to perform one or more of the methods described herein, including methods illustrated in FIGS. 3-4 . As shown in FIG. 5 , some embodiments may have a separate DSP 520 , depending on desired functionality.
  • the mobile device 120 also can include one or more input devices 570 , which can include without limitation one or more camera(s), a touch screen, a touch pad, microphone, button(s), dial(s), switch(es), and/or the like; and one or more output devices 515 , which can include without limitation a display, light emitting diode (LED), speakers, and/or the like.
  • the mobile device 120 might also include a wireless communication interface 530, which can include without limitation a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an IEEE 802.11 device, an IEEE 802.15.4 device, a WiFi device, a WiMax device, cellular communication facilities, etc.), and/or the like.
  • the wireless communication interface 530 may permit data to be exchanged with a network, wireless access points, other computer systems, and/or any other electronic devices described herein.
  • the communication can be carried out via one or more wireless communication antenna(s) 532 that send and/or receive wireless signals 534 .
  • the wireless communication interface 530 can include separate transceivers to communicate with base transceiver stations (e.g., base transceiver stations of a cellular network) and access points.
  • These different data networks can include an OFDMA network and/or other types of networks.
  • the mobile device 120 can further include sensor(s) 540 , as previously described.
  • sensors can include, without limitation, one or more accelerometer(s), gyroscope(s), camera(s), magnetometer(s), altimeter(s), microphone(s), proximity sensor(s), light sensor(s), and the like.
  • At least a subset of the sensor(s) 540 can provide camera frames and/or inertial information used by a VIT for tracking.
  • Embodiments of the mobile device may also include a Satellite Positioning System (SPS) receiver 580 capable of receiving signals 584 from one or more SPS satellites using an SPS antenna 582 .
  • Such positioning can be utilized to complement and/or be incorporated in the techniques described herein.
  • an SPS may include any combination of one or more global and/or regional navigation satellite systems and/or augmentation systems, and SPS signals may include SPS, SPS-like, and/or other signals associated with such one or more SPS.
  • GPS is an example of an SPS.
  • the mobile device 120 may further include and/or be in communication with a memory 560 .
  • the memory 560 can include, without limitation, local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
  • Such storage devices may be configured to implement any appropriate data structures, such as the FIFO and/or other memory utilized by the techniques described herein, and may be allocated by hardware and/or software elements of an OFDM receiver. Additionally or alternatively, data structures described herein can be implemented by a cache or other local memory of a DSP 520 or processing unit 510 .
  • Memory can further be used to store an image stack, inertial sensor data, and/or other information described herein.
  • the memory 560 of the mobile device 120 also can comprise software elements (not shown), including an operating system, device drivers, executable libraries, and/or other code, such as one or more application programs, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • one or more methods described herein, such as those illustrated in FIGS. 3-4, might be implemented as code and/or instructions executable by the mobile device 120 (and/or the processing unit 510 within a mobile device 120) and/or stored on a non-transitory and/or machine-readable storage medium (e.g., a “computer-readable storage medium,” a “machine-readable storage medium,” etc.).
  • code and/or instructions can be used to configure and/or adapt a general purpose processor (or other device) to perform one or more operations in accordance with the described methods.
  • a CV application refers to a class of applications related to the acquisition, processing, analyzing, and understanding of images.
  • CV applications include, without limitation, mapping, modeling (including 3D modeling), navigation, augmented reality applications, and various other applications where images acquired from an image sensor are processed to build maps, models, and/or to derive/represent structural information about the environment from the captured images.
  • geometric information related to captured images may be used to build a map, model, and/or other representation of objects and/or other features in a physical environment.
  • Embodiments can include, for example, personal computers and/or other electronics not generally considered “mobile.”
  • a person of ordinary skill in the art will recognize many alterations to the described embodiments.
  • the term “at least one of” if used to associate a list, such as A, B, or C, can be interpreted to mean any combination of A, B, and/or C, such as A, AB, AA, AAB, AABBCCC, etc.

Abstract

A Visual Inertial Tracker (VIT), such as a Simultaneous Localization And Mapping (SLAM) system based on an Extended Kalman Filter (EKF) framework (EKF-SLAM), can provide drift correction in calculations of a pose (translation and orientation) of a mobile device by obtaining location information regarding a target, obtaining an image of the target, estimating, from the image of the target, measurements relating to the pose of the mobile device based on the image and location information, and correcting a pose determination of the mobile device using an EKF, based, at least in part, on the measurements relating to the pose of the mobile device.

Description

  • This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/883,921, entitled “OFF-TARGET TRACKING USING FEATURE AIDING IN THE CONTEXT OF INERTIAL NAVIGATION,” filed on Sep. 27, 2013, which is hereby incorporated by reference for all purposes as if fully set forth herein.
  • BACKGROUND
  • Camera-based tracking of a mobile device's rotation and translation can be executed by the mobile device (e.g., mobile phone, tablet, heads-up display, and the like) to enable the mobile device to provide a wide variety of features, such as augmented reality and location tracking. In order to more accurately track the mobile device's translation and orientation (also known as the six degrees of freedom, or “pose”), a mobile device may additionally incorporate information from sensors such as gyroscopes, accelerometers, GPS receivers, and the like. However, sensor noise and modeling errors can cause a tracking system to “drift,” resulting in inaccurate pose determinations. These inaccuracies will build on each other, increasing over time, unless measures are taken to correct the pose determinations. Furthermore, GPS reception is poor to non-existent indoors, and inertial sensors alone cannot provide absolute pose.
  • SUMMARY
  • A Visual Inertial Tracker (VIT), such as a Simultaneous Localization And Mapping (SLAM) system based on an Extended Kalman Filter (EKF) framework (EKF-SLAM) can provide drift correction in calculations of a pose (translation and orientation) of a mobile device.
  • An example method of correcting drift in a tracking system of a mobile device, according to the disclosure, includes obtaining location information regarding a target, obtaining an image of the target, the image captured by the mobile device, and estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information. The pose comprises information indicative of a translation and orientation of the mobile device. The method further comprises correcting a pose determination of the mobile device using an EKF, based, at least in part, on the measurements relating to the pose of the mobile device.
  • The example method of correcting drift in a tracking system of a mobile device can include one or more of the following features. The method can include obtaining absolute coordinates of the target, where correcting the pose is further based, at least in part, on the absolute coordinates of the target. The method can include processing the image of the target to determine that the target was captured in the image, where the processing includes comparing one or more keypoints of the image with one or more keypoints of each target in a plurality of known targets. The method can include receiving one or more wireless signals from one or more access points, determining a proximity of the one or more access points based on wireless signals, and determining the plurality of known targets, based on the determined proximity of the one or more access points. The tracking system can incorporate a Simultaneous Localization And Mapping (SLAM) system with the EKF. The pose determination can be based, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device. The method can include determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
  • An example mobile device, according to the disclosure, can include a camera, a memory, and a processing unit. The processing unit is operatively coupled with the camera and the memory and configured to obtain location information regarding a target, obtain an image of the target, the image captured by the camera of the mobile device, and estimate, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device. The processing unit is further configured to correct a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • The example mobile device can include one or more of the following features. The processing unit can be configured to obtain absolute coordinates of the target and further configured to correct the pose based, at least in part, on the absolute coordinates of the target. The processing unit can be configured to process the image of the target to determine that the target was captured in the image, where processing the image includes comparing one or more features of the image with one or more features of each target in a plurality of known targets. The mobile device may include a wireless communication interface configured to receive one or more wireless signals from one or more access points, and the processing unit can be further configured to determine a proximity of the one or more access points based on wireless signals, and determine the plurality of known targets, based on the determined proximity of the one or more access points. The processing unit can be configured to incorporate a Simultaneous Localization And Mapping (SLAM) system with the EKF. The mobile device can include one or more motion sensors, and the processing unit can be further configured to determine the pose determination based, at least in part, on one or more measurements received from the one or more motion sensors. The one or more motion sensors can include one or more of an accelerometer or a gyroscope. The processing unit can be configured to determine a bias of the one or more motion sensors, based at least in part on the pose determination and the corrected pose.
  • An example apparatus, according to the disclosure, can include means for obtaining location information regarding a target, means for obtaining an image of the target, the image captured by a mobile device, and means for estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device. The example apparatus further includes means for correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • The example apparatus can further include one or more of the following features. The apparatus can include means for obtaining absolute coordinates of the target, where the means for correcting the pose is configured to base the corrected pose, at least in part, on the absolute coordinates of the target. The apparatus can include means for processing the image of the target to determine that the target was captured in the image, where the means for processing the image include means for comparing one or more features of the image with one or more features of each target in a plurality of known targets. The apparatus can include means for receiving one or more wireless signals from one or more access points, means for determining a proximity of the one or more access points based on wireless signals, and means for determining the plurality of known targets, based on the determined proximity of the one or more access points. The apparatus can include means for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF. The apparatus can include means for basing the pose determination, at least in part, on measurements of one or more of an accelerometer or a gyroscope of the mobile device. The apparatus can include means for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
  • An example non-transitory machine-readable medium, according to the disclosure, can have instructions embedded thereon for correcting drift in a tracking system of a mobile device. The instructions include computer code for obtaining location information regarding a target, obtaining an image of the target, the image captured by the mobile device, and estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, where the pose comprises information indicative of a translation and orientation of the mobile device. The instructions also include computer code for correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
  • The example non-transitory machine-readable medium can further include instructions including computer code for one or more of the following features. Instructions can include computer code for obtaining absolute coordinates of the target, wherein the computer code is further configured to base correcting the pose, at least in part, on the absolute coordinates of the target. Instructions can include computer code for processing the image of the target to determine that the target was captured in the image, wherein the computer code for processing includes computer code for comparing one or more features of the image with one or more features of each target in a plurality of known targets. Instructions can include computer code for receiving one or more wireless signals from one or more access points, determining a proximity of the one or more access points based on wireless signals, and determining the plurality of known targets, based on the determined proximity of the one or more access points. Instructions can include computer code for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF. The computer code can be configured to base the pose determination, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device. Instructions can include computer code for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
  • Items and/or techniques described herein may provide one or more of the following capabilities, as well as other capabilities not mentioned. Techniques can provide for the mitigation of drift in an indoor location tracking system, such as a Visual Inertial Tracker (VIT), providing for increased accuracy. This, in turn, can lead to a better user experience of applications and/or other features of a mobile device that are dependent on the indoor location tracking system. These and other advantages and features are described in more detail in conjunction with the text below and attached figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • An understanding of the nature and advantages of various embodiments may be realized by reference to the following figures. In the appended figures, similar components or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a dash and a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
  • FIG. 1 is a simplified image that can help illustrate how a Visual Inertial Tracker (VIT) can utilize targets for pose estimation and/or correction, according to an embodiment.
  • FIG. 2 is a block diagram of an example VIT.
  • FIG. 3 is a flow chart of a high-level process of drift correction, according to an embodiment, which can be executed by a VIT or other tracking system.
  • FIG. 4 is a flow diagram of a method of correcting drift in a VIT or other tracking system, according to an embodiment.
  • FIG. 5 is a block diagram of an embodiment of a mobile device.
  • DETAILED DESCRIPTION
  • The detailed description set forth below in connection with the appended drawings is intended as a description of various aspects of the present disclosure and is not intended to represent the only aspects in which the present disclosure may be practiced. Each aspect described in this disclosure is provided merely as an example or illustration of the present disclosure, and should not necessarily be construed as preferred or advantageous over other aspects. The detailed description includes specific details for the purpose of providing a thorough understanding of the present disclosure. However, it will be apparent to those skilled in the art that the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid obscuring the concepts of the present disclosure. Acronyms and other descriptive terminology may be used merely for convenience and clarity and are not intended to limit the scope of the disclosure.
  • Mobile devices, such as mobile phones, media players, tablets, head-mounted displays (HMDs) and other wearable electronic devices, and the like, can often execute applications and/or provide features that utilize the mobile device's translation and orientation, or “pose.” Tracking the mobile device's pose in a spatial coordinate system such as the Earth-Centered, Earth-Fixed (ECEF) coordinate system can be accomplished in any of a variety of ways. Oftentimes, this is done utilizing built-in sensors of the mobile device, such as accelerometers, gyroscopes, a Global Positioning System (GPS) receiver, and the like. A determination of the mobile device's pose can be used to enable or enhance navigation, games, and/or other applications.
  • Where GPS positioning is unavailable or unreliable, such as in indoor environments, the tracking of a mobile device's pose can be done by a Visual Inertial Tracker (VIT), which can combine measurements from visual tracking systems (which utilize visual sensors such as cameras) with measurements from inertial tracking systems (which utilize inertial sensors such as accelerometers and gyroscopes). Other sensors can be utilized as well, such as a barometer or altimeter to determine and/or correct altitude measurements. That said, where GPS coordinates are available, they may also be used to provide absolute location information to a VIT. One such embodiment of a VIT incorporates a Simultaneous Localization And Mapping (SLAM) system based on an Extended Kalman Filter (EKF) framework: the EKF receives various measurements from the different sensors listed above to track the pose of a phone. This type of system is referred to herein as an EKF-SLAM system. A keyframe-based parallel tracking and mapping (“PTAM”) system is another example of a SLAM system.
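For reference, an EKF-SLAM filter of this kind typically stacks the device state and the mapped keypoints into a single state vector. The arrangement below is a common textbook formulation consistent with the filter outputs described later in this disclosure (pose, sensor biases, gravity, 3-D keypoint locations); it is illustrative, not necessarily the exact state used here:

```latex
% Representative EKF-SLAM state vector (an assumption; one common arrangement)
\mathbf{x} = \begin{bmatrix}
  \mathbf{p}^{T} & \mathbf{v}^{T} & \mathbf{q}^{T} &
  \mathbf{b}_{a}^{T} & \mathbf{b}_{g}^{T} & \mathbf{g}^{T} &
  \mathbf{f}_{1}^{T} & \cdots & \mathbf{f}_{N}^{T}
\end{bmatrix}^{T}
```

where p and v are the device position and velocity, q is the orientation quaternion, b_a and b_g are the accelerometer and gyroscope biases, g is the gravity vector, and f_1 through f_N are the 3-D positions of the tracked keypoints.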
  • Despite using both visual and inertial measurements, drift can still be a problem for such VITs. In other words, inaccuracies in these inputs can accumulate over time. The drift of current state-of-the-art VIT systems is about 1% over distance traveled. So, for example, a VIT executed on a mobile device held by a walking user will drift about 1 meter for every 100 meters the user walks.
  • Embodiments are described herein that can provide pose initialization and/or drift correction in VITs (such as the EKF-SLAM system previously described) by providing a pose measurement derived from an image of a target with a known position. This pose measurement can be used to correct drift. Moreover, such correction can take place each time a VIT captures an image of a known target.
  • FIG. 1 is a simplified image that can help illustrate how a VIT can utilize such targets for pose correction, according to one embodiment. In this embodiment, a user can use an application executed on a mobile device 120 to track the user's position in a store. Depending on the functionality of the application, it may display information on a display 130 of the mobile device 120, such as the user's position on a map of the store and/or where certain items may be located.
  • The application may track the user's position from information obtained by a VIT, which is also executed by the mobile device 120. As part of the tracking process, the camera may capture an image 100 (e.g., a video frame) of a certain portion of the store that may have one or more targets 110. Targets 110 can be images or objects with known locations, shown in FIG. 1 as aisle signs (targets 110-1 and 110-2 at the end of aisles 1 and 2, respectively). The targets 110 include one or more keypoints recognizable by the detection algorithm that uses the keypoints to provide the pose measurement to the VIT. When one or more targets enter the camera view, an accurate pose can be determined as a result of the known location(s) of the keypoints on the target(s) 110. This can happen in four steps: keypoint detection, keypoint matching, outlier rejection, and pose estimation based on minimization of reprojection error (sketched below). Example implementations are described in “Real-Time Detection and Tracking for Augmented Reality on Mobile Phones” by Daniel Wagner et al. in IEEE Transactions on Visualization and Computer Graphics, 2009, found at http://dl.acm.org/citation.cfm?id=1605359, which is incorporated by reference herein. The pose in the VIT can then be replaced or updated by the pose obtained from the keypoints on the target to reduce drift. (Alternatively, the pose may be used to initialize the VIT if this is the first absolute pose measurement.)
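A minimal sketch of this four-step pose measurement follows, assuming OpenCV primitives (ORB descriptors, a brute-force matcher, and RANSAC-based PnP). The function name, thresholds, and target record layout are illustrative assumptions, not the implementation of this disclosure:

```python
import cv2
import numpy as np

def pose_from_target(frame_gray, target_pts_3d, target_desc, K, dist_coeffs):
    """Four-step pose measurement from a known target:
    (1) keypoint detection, (2) matching, (3) outlier rejection,
    (4) pose estimation by reprojection-error minimization (PnP)."""
    # (1) Detect and describe keypoints in the camera frame
    orb = cv2.ORB_create(nfeatures=1000)
    kps, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None

    # (2) Match frame descriptors against the target's stored descriptors
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc, target_desc)
    if len(matches) < 6:
        return None  # too few correspondences for a reliable pose

    img_pts = np.float32([kps[m.queryIdx].pt for m in matches])
    obj_pts = np.float32([target_pts_3d[m.trainIdx] for m in matches])

    # (3) + (4) RANSAC outlier rejection and reprojection-error minimization
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, dist_coeffs, reprojectionError=3.0)
    if not ok or inliers is None or len(inliers) < 6:
        return None
    return rvec, tvec  # camera pose relative to the target's frame
```

Here `target_pts_3d` holds the known 3-D coordinates of the target's keypoints and `target_desc` their stored ORB descriptors; K and dist_coeffs are the camera calibration discussed with FIG. 2.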
  • As an example, if the image 100 reveals that the pose obtained from the keypoints on the target 110-1 corresponds to the pose determined by the VIT when the image 100 was taken, then no drift correction is needed. However, if there is a mismatch between the pose obtained from the keypoints on the target 110-1 and the pose determined by the VIT, then drift correction is needed.
  • In some embodiments, drift correction can include resetting the VIT's pose by replacing the pose previously determined by the VIT with the pose obtained using the keypoints on the target 110-1. Moreover, some embodiments may avoid the additional processing cost of frequently running keypoint detection by analyzing the VIT pose and VIT pose uncertainty and executing keypoint detection only if keypoints on a target are predicted to be in view (a sketch of this gating follows below).
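A sketch of such gating, assuming a pinhole camera model and the pose convention noted in the comments; using a fixed pixel margin to stand in for pose uncertainty is an illustrative simplification:

```python
import numpy as np

def target_predicted_in_view(R_wc, t_wc, target_xyz_world, K,
                             img_w, img_h, margin_px=50.0):
    """Project a target's known world position through the current VIT pose
    and run keypoint detection only if it lands (near) inside the frame.
    R_wc rotates camera coordinates into world coordinates; t_wc is the
    camera position in world coordinates. `margin_px` widens the test to
    account for pose uncertainty."""
    # World point expressed in camera coordinates
    p_cam = R_wc.T @ (np.asarray(target_xyz_world) - np.asarray(t_wc))
    if p_cam[2] <= 0:
        return False  # target is behind the camera
    uv = K @ (p_cam / p_cam[2])  # pinhole projection to pixel coordinates
    u, v = uv[0], uv[1]
    return (-margin_px <= u <= img_w + margin_px and
            -margin_px <= v <= img_h + margin_px)
```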
  • Of course, different embodiments may choose to perform such drift correction differently, depending on desired functionality, processing considerations, and other factors. For example, a pose can be calculated for each image in which a target appears, once for a given period of time and/or set of video frames, once for each series of video frames in which a target appears, and the like. A person of ordinary skill in the art will recognize many additional variations.
  • FIG. 2 is a block diagram of an example VIT 200. This VIT 200 employs an EKF-SLAM topology that utilizes a computer vision (CV) component 210 and an EKF component 220. These components can be executed in hardware and/or software of a mobile device 120, an example of which is described in further detail below with regard to FIG. 5.
  • As illustrated in FIG. 2, the CV component 210 can receive images, or camera frames, from a mobile device, the camera frames having accurate time stamps. This can enable the CV component 210 to determine when an image was captured, so the image can be combined or fused in the EKF with time-stamped inertial sensor measurements from the same mobile device. Depending on desired functionality, the camera frames can be captured as a series of still images and/or as part of video. Embodiments utilizing video capture can, for example, receive images at 30 frames per second. Other embodiments may utilize other frame rates. In some embodiments, only a portion of frames captured by a camera may be provided to and/or utilized by the CV component. Some embodiments may have CV components that utilize all frames.
  • The CV component 210 can also receive camera calibration information. Intrinsic camera calibration includes, for example, the principal point, focal length at infinity, and radial distortion. Extrinsic camera calibration parameters include rotation and translation with respect to the inertial sensor chip. Rotation can be estimated in the VIT or assumed to be lined up with the camera. The remaining camera calibration parameters are typically very similar across mobile devices of one type. Hence, obtaining them, for example, from one mobile phone of a certain model allows the calibration parameters to be applied to all mobile phones of that model.
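As an illustration, the intrinsic parameters named above are conventionally collected into a 3x3 camera matrix plus a distortion vector. The numeric values below are placeholder assumptions for a hypothetical phone model, not measured calibration data:

```python
import numpy as np

# Illustrative intrinsics for one phone model (values are assumptions):
fx = fy = 1450.0          # focal length at infinity, in pixels
cx, cy = 960.0, 540.0     # principal point, in pixels
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
# Radial (k1, k2, k3) and tangential (p1, p2) distortion coefficients
dist_coeffs = np.array([0.10, -0.25, 0.0, 0.0, 0.0])
```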
  • Using the camera frames and camera calibration, the CV component 210 can employ any of a variety of algorithms to implement keypoint detection and keypoint tracking on the received camera frames. Keypoint detection can be based on Harris corners. Keypoint tracking can be based on Normalized Cross-Correlation (NCC). The keypoint detection/tracking provides 2-D keypoint measurements in the camera frame that are relayed to the EKF component 220. The EKF component 220, utilizing sensor measurements (as detailed below), can calculate and share the predicted 2D camera coordinates of keypoints with the CV component 210 to limit the search space of the image point finder, increasing efficiency of the process. The 2D camera coordinates of keypoints measured by the image point finder and provided to the EKF component 220 are ultimately used to estimate the pose of the mobile device 120.
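A minimal OpenCV sketch of both steps under the assumptions named above (Harris detection, NCC tracking within a search window centered on the EKF-predicted location); the window sizes and correlation threshold are illustrative:

```python
import cv2
import numpy as np

def detect_harris(gray, max_corners=200):
    """Harris-based corner detection on one camera frame."""
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=10,
                                  useHarrisDetector=True, k=0.04)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2))

def track_ncc(prev_gray, gray, pt, patch=11, search=32, min_score=0.8):
    """Track one keypoint by normalized cross-correlation inside a search
    window centered on the EKF-predicted 2-D location `pt`."""
    x, y = int(round(pt[0])), int(round(pt[1]))
    h = patch // 2
    if y - search < 0 or x - search < 0 or y - h < 0 or x - h < 0:
        return None  # too close to the image border
    tmpl = prev_gray[y - h:y + h + 1, x - h:x + h + 1]
    win = gray[y - search:y + search + 1, x - search:x + search + 1]
    if tmpl.shape != (patch, patch) or min(win.shape) < patch:
        return None
    scores = cv2.matchTemplate(win, tmpl, cv2.TM_CCOEFF_NORMED)
    _, score, _, loc = cv2.minMaxLoc(scores)  # best (max) correlation
    if score < min_score:
        return None  # track lost
    # Convert best-match position back to full-image coordinates
    return (x - search + loc[0] + h, y - search + loc[1] + h)
```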
  • In addition to the VIT pose, the EKF component 220 utilizes the 2-D keypoint measurements from the CV component 210, together with sensor measurements from a gyroscope (“gyro meas” in FIG. 2), accelerometer (“accel meas”), and the like, to jointly estimate the three-dimensional (3D) positions of the keypoints, biases on the accelerometers and gyroscopes, and the gravity vector. For more information on an EKF-SLAM implementation, see Jones, Eagle S., and Stefano Soatto, “Visual-inertial navigation, mapping and localization: A scalable real-time causal approach,” The International Journal of Robotics Research 30.4 (2011): 407-430, which is incorporated by reference herein in its entirety.
  • GPS measurements (“GPS meas”) can also be provided to the EKF component to provide an absolute coordinate framework in which the mobile device 120 may be tracked. For example, a mobile device 120 initially may be in a location that can receive GPS measurements, and may therefore determine absolute location coordinates for the mobile device 120. As the mobile device moves to a location in which GPS measurements are not received (e.g., indoors), the VIT 200 may determine absolute coordinates of the mobile device 120 based on the mobile device's movement relative to a position in which absolute coordinates were determined based on GPS information.
  • Embodiments can further provide, as an input to the EKF component 220, a pose measurement derived from an image of a target. As described above, the pose measurement can come from a keypoint detector, which can not only detect keypoints of a target in an image but also determine the pose of the mobile device 120 based on the image. In some embodiments, the pose measurement may be provided from the CV component 210 and/or may be derived from 2D camera coordinates. Other embodiments may provide additional and/or alternative components to provide target detection and/or pose measurement.
  • Details regarding the targets can be locally stored and/or accessible by the VIT 200. Details can include location information such as absolute location of the target and/or its keypoints for pose calculation based on an image of the target. Additionally or alternatively, the details can include information regarding how the target may be identified.
  • By being able to determine pose in relation to the target and by knowing the absolute coordinates of the target (and/or one or more keypoints of the target), the absolute pose can be determined and used by the EKF component 220 to correct for any drift that might have taken place. Correction can include, for example, overriding a pose calculated by the EKF component 220 with the newly-determined pose measurement.
  • Depending on desired functionality, the EKF component 220 can output various types of data. As indicated in FIG. 2, for example, the EKF component 220 can output bias of an accelerometer, gyroscope, and/or other sensor (“accel bias gyro bias”), the determined pose of the mobile device (“pose of the phone”), 3D locations of keypoints (“3-D locations of keypoints”), and/or an estimation of the gravity vector (“gravity”). Any or all of these outputs may be influenced by the pose measurement determined from a detected target in a camera frame. The EKF component 220 seeks to minimize innovations between predicted and measured 2-D camera keypoints and can adjust inertial sensor biases, pose, gravity vector, and location of keypoints in 3-D to that end.
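“Minimizing innovations,” as used above, corresponds to the standard EKF measurement update, shown here in textbook form rather than as the exact formulation of the EKF component 220:

```latex
\begin{aligned}
\mathbf{y}_k &= \mathbf{z}_k - h(\hat{\mathbf{x}}_{k\mid k-1})
  && \text{innovation: measured minus predicted 2-D keypoints} \\
\mathbf{K}_k &= \mathbf{P}_{k\mid k-1}\mathbf{H}_k^{T}
  \bigl(\mathbf{H}_k \mathbf{P}_{k\mid k-1}\mathbf{H}_k^{T} + \mathbf{R}_k\bigr)^{-1}
  && \text{Kalman gain} \\
\hat{\mathbf{x}}_{k\mid k} &= \hat{\mathbf{x}}_{k\mid k-1} + \mathbf{K}_k \mathbf{y}_k
  && \text{state update (pose, biases, gravity, 3-D keypoints)} \\
\mathbf{P}_{k\mid k} &= \left(\mathbf{I} - \mathbf{K}_k\mathbf{H}_k\right)\mathbf{P}_{k\mid k-1}
  && \text{covariance update}
\end{aligned}
```

Here h(·) projects the predicted state into 2-D camera keypoint coordinates, H_k is its Jacobian, P is the state covariance, and R_k is the measurement noise covariance.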
  • The creation of keypoints on targets for pose correction in a VIT can be done in any number of different ways, for any number of different applications. For instance, if a picture of the target is taken from the fronto-parallel view, keypoints can be determined using the FAST corner algorithm, scale can be provided by measuring the distance between two of the keypoints, and descriptors can be obtained from the pixel measurements in the vicinity of the respective keypoints (a sketch follows below). The placement and designation of targets for a venue can vary, depending on desired functionality. Broadly speaking, the more targets that are located in and distributed throughout a venue, the more drift correction they can provide to a VIT, enabling more accurate pose determination. The designation of targets and the creation of a venue map can be facilitated through an application on a mobile device. Additionally or alternatively, the data associated with the map—such as location information regarding the targets utilized to obtain pose measurements using the techniques described herein—can be collected with the designation of targets and incorporated into the map.
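Under those assumptions (fronto-parallel photo, FAST corners, raw-patch descriptors, scale from a physically measured keypoint-to-keypoint distance), target creation might look like the following; the record layout, threshold, and patch size are hypothetical:

```python
import cv2
import numpy as np

def build_target(fronto_parallel_bgr, measured_dist_m, idx_a=0, idx_b=1,
                 patch=15):
    """Build a target record from a fronto-parallel photo: FAST keypoints,
    metric scale from a measured distance between two of the keypoints,
    and raw-patch descriptors from the pixels around each keypoint."""
    gray = cv2.cvtColor(fronto_parallel_bgr, cv2.COLOR_BGR2GRAY)
    fast = cv2.FastFeatureDetector_create(threshold=25)
    kps = fast.detect(gray, None)
    if len(kps) < 2:
        raise ValueError("not enough FAST keypoints to set scale")

    pts = np.float32([k.pt for k in kps])
    # Scale: pixels-per-meter from the physically measured distance
    # between keypoints idx_a and idx_b
    px_per_m = np.linalg.norm(pts[idx_a] - pts[idx_b]) / measured_dist_m

    h = patch // 2
    patches = [gray[int(y) - h:int(y) + h + 1, int(x) - h:int(x) + h + 1]
               for (x, y) in pts]
    return {"keypoints_px": pts, "px_per_m": px_per_m, "patches": patches}
```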
  • FIG. 3 is a flow chart of a high-level process of drift correction, according to one embodiment, which can be executed by a VIT or other tracking system. More specifically, means for performing one or more of the illustrated components can include hardware and/or software means described in further detail in relation to FIG. 5, which may be logically separated into separate components, such as the components described in FIG. 2. Some or all of the components can be executed by hardware and/or software at an operating system or device level. Other embodiments may include alterations to the embodiments shown. Components shown in FIG. 3 may be performed in a different order and/or simultaneously, according to different embodiments. Moreover, a person of ordinary skill in the art will recognize many additions, omissions, and/or other variations.
  • The process can start by receiving a camera image at block 310. The type of camera image can vary in resolution, color, and/or other characteristics, depending on desired functionality, camera hardware, and/or other factors. Moreover, the camera image may be a discrete still image or may be one of several frames of video captured by the camera. In some embodiments, the image may be processed to a degree before it is received by a VIT, to facilitate further image processing by the VIT.
  • At block 320, the VIT optionally receives WiFi signals, which can facilitate the determination of which targets may be included in the received image. For example, wireless signals may be utilized together with a map of a venue that includes the identity and locations of WiFi access points. If the locations of certain access points can be determined from the map, the VIT can get a rough estimate of where in the venue the VIT (and any mobile device associated therewith) is. The VIT can do this by measuring WiFi signals received from the WiFi access points by the mobile device (e.g., measuring received signal strength indication (RSSI), round-trip time (RTT), and/or other measurements) to determine a proximity of the access points—including which access points may be closest. This can then be compared with the map to determine a region in the venue in which the mobile device is located.
  • The VIT of the mobile device can then reduce processing loads related to target detection by determining nearby targets based on the WiFi signals, at block 330, and reducing the targets to detect to the nearby targets. Such optional functionality can be beneficial, for instance, when a customer starts an application that uses the map only after having entered the venue. With no location initially, the VIT can benefit from detecting rough location from WiFi signals. Furthermore, tight integration of GPS with VIT would allow for an initial position estimate using GPS measurements (code, Doppler, carrier phase) in addition to the regular VIT measurements listed above.
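One way to realize this rough-location step is a log-distance path-loss model for RSSI-based ranging plus a simple radius test against the venue map. The model constants, pruning radius, and data layout below are all assumptions, not part of this disclosure:

```python
import math

def rssi_to_distance_m(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exp=2.5):
    """Rough range from RSSI via a log-distance path-loss model.
    Both calibration constants are assumed, not measured values."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def nearby_targets(scans, ap_positions, targets, radius_m=30.0):
    """Prune the target database to those near the closest-sounding AP.
    `scans` maps BSSID -> RSSI (dBm); `ap_positions` maps BSSID -> (x, y)
    from the venue map; each target carries a 2-D position under "xy"."""
    best_ap, best_range = None, float("inf")
    for bssid, rssi in scans.items():
        if bssid not in ap_positions:
            continue  # AP not on the venue map
        rng = rssi_to_distance_m(rssi)
        if rng < best_range:
            best_ap, best_range = ap_positions[bssid], rng
    if best_ap is None:
        return targets  # no usable WiFi fix: keep all known targets
    return [t for t in targets if math.dist(t["xy"], best_ap) <= radius_m]
```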
  • It will be understood that, although WiFi signals are described in the embodiment shown in FIG. 3, additional or alternative wireless signals may be utilized.
  • At block 340, the image is processed to detect targets. As previously described, targets can be images or objects with known locations. Thus, the VIT can utilize a detection algorithm in which the image is processed to determine whether certain keypoints of the image match with keypoints of known targets by, for example, comparing the keypoints of the image with keypoints of one or more known targets (e.g., targets having keypoints and location information stored for comparison). Depending on desired functionality, the algorithm(s) used can implement a variety of detection and matching techniques, from simple edge detection to the recognition of more complex patterns, symbols, and more.
  • At block 350, a determination is made of whether a target is in the image. Techniques for making the determination can vary based on the detection algorithm(s) involved. In some embodiments, detection algorithms can determine whether a target is in the image by determining whether one or more keypoints in the image match with one or more corresponding keypoints of a known target to a degree above a threshold level of certainty.
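A threshold test of the kind described above can be as simple as requiring both a minimum match count and a minimum matched fraction of the target's keypoints; both values below are illustrative assumptions:

```python
def target_detected(n_matched, n_target_keypoints,
                    min_matches=12, min_fraction=0.2):
    """Declare a detection only when enough of the target's keypoints
    were matched in the image (illustrative thresholds)."""
    if n_target_keypoints == 0:
        return False
    return (n_matched >= min_matches and
            n_matched / n_target_keypoints >= min_fraction)
```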
  • If a target is not determined to be in an image, the process ends (potentially restarting with the receipt of a new image). However, if a target is determined to be in the image, a pose is determined based on the camera image, at block 360. As explained above with regard to FIG. 2, pose determination can utilize any of a variety of techniques to determine pose based on the known location of the target in the image, as well as information obtained from and/or associated with the image itself. For example, the VIT can determine a distance and orientation of the target in relation to the mobile device (e.g., by analyzing characteristics of detected keypoints in the image, such as location, spacing, etc.), and use this information, together with the known location (and orientation) of the target, to determine a pose of the mobile device.
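Turning the target-relative pose into an absolute device pose is a composition of rigid-body transforms; a minimal numpy sketch, assuming 4x4 homogeneous transforms with the conventions named in the comments:

```python
import numpy as np

def make_T(R, t):
    """Pack a 3x3 rotation matrix and translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def absolute_device_pose(T_target_in_world, T_camera_in_target):
    """Chain rigid transforms: the target's surveyed pose in the world
    frame, composed with the camera pose estimated relative to the
    target, yields the device's absolute (camera-in-world) pose."""
    return T_target_in_world @ T_camera_in_target
```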
  • At block 370, the pose from the target can be provided to an EKF of the VIT in the manner described previously, allowing the VIT to correct (e.g., adjust or replace) a pose determination (which may have been previously and/or separately determined from visual and/or inertial sensor input), based on the pose provided in the process of FIG. 3.
  • FIG. 4 is a flow diagram of another, more generalized method 400 of correcting drift in a VIT or other tracking system, according to one embodiment. Means for performing one or more of the components of the method 400 can include hardware and/or software means described in further detail in relation to FIG. 5, which may be logically separated into different components, such as the components described in FIG. 2. The method 400, and other techniques described herein, can be executed by hardware and/or software at an operating system or device level. Alternative embodiments may include alterations to the embodiments shown. Components of the method 400, although illustrated in a particular order, may be performed in a different order and/or simultaneously, according to different embodiments. Moreover, a person of ordinary skill in the art will recognize many additions, omissions, and/or other variations.
  • At block 410, location information regarding a target is obtained. As indicated previously, location information regarding a target can include keypoints associated with coordinates in a spatial coordinate system. Corresponding descriptors of one or more targets can be associated with a map, such as a map of a location in which a VIT system is used to track a mobile device's pose. In some embodiments, this information (as well as information for other targets of a venue) can be stored on a server of the venue, and transferred to a mobile device (e.g., wirelessly via WiFi using an application executed by the mobile device) when the mobile device enters or approaches the venue.
  • At block 420, an image of the target is captured by the mobile device. The image can be captured as part of a VIT tracking process, and may be one of a series of video frames. As previously described, the image may be processed to extract keypoints from the image and use one or more detection algorithms to determine whether the target is in the image. For example, algorithms may include comparing one or more keypoints extracted from the image with one or more keypoints of each target in a plurality of known targets of a venue.
  • At block 430, measurements relating to a pose of the mobile device are estimated using the keypoints positioned on the targets. As previously indicated, positioned keypoints can be used to reveal the pose of the mobile device in a spatial coordinate system.
  • At block 440, a pose determination of the tracking system is corrected using an EKF, based on the measurements relating to the pose of the mobile device. As described above, the tracking system can use visual and inertial information to make the pose determination of the mobile device, which can be used in various applications, such as indoor navigation, augmented reality, and more. Because the pose determination is subject to drift, it can be corrected (e.g., modified or replaced) by providing measurements estimated from the image to an EKF. For example, a pose measurement can be obtained from a target using the process of keypoint detection, keypoint matching, outlier rejection, and pose estimation as described above. The pose measurement can then be provided to an EKF component to correct the pose of the mobile device. In alternative embodiments, a correction to a pose may not involve an EKF.
  • Depending on desired functionality, different embodiments may implement variations on the method 400 of correcting drift in a VIT illustrated in FIG. 4. For example, in one implementation, a customer may be able to simply point the phone at a target to obtain a pose, a map, and his or her position on the map without any prior and/or subsequent tracking. That is, the pose obtained from a target may provide an initial pose to a VIT in addition or as an alternative to replacing a pose previously determined by the VIT. In some embodiments, a VIT may further obtain absolute coordinates of the target, enabling the VIT to correct a determined pose of the mobile device using the keypoints of the target and the absolute coordinates associated therewith. A person of ordinary skill in the art will recognize many additional variations.
  • FIG. 5 is a block diagram of an embodiment of a mobile device 120, which can implement the techniques for correcting a pose determination of the tracking system, such as the method 400 shown in FIG. 4. It should be noted that FIG. 5 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. Moreover, system elements may be implemented in a relatively separated or relatively more integrated manner. Additionally or alternatively, some or all of the components shown in FIG. 5 can be utilized in another computing device, which can be used in conjunction with a mobile device 120 as previously described.
  • The mobile device 120 is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate). The hardware elements may include a processing unit 510 which can include without limitation one or more general-purpose processors, one or more special-purpose processors (such as digital signal processors (DSPs), graphics acceleration processors, application specific integrated circuits (ASICs), and/or the like), and/or other processing structure or means, which can be configured to perform one or more of the methods described herein, including methods illustrated in FIGS. 3-4. As shown in FIG. 5, some embodiments may have a separate DSP 520, depending on desired functionality. The mobile device 120 also can include one or more input devices 570, which can include without limitation one or more camera(s), a touch screen, a touch pad, microphone, button(s), dial(s), switch(es), and/or the like; and one or more output devices 515, which can include without limitation a display, light emitting diode (LED), speakers, and/or the like.
  • The mobile device 120 might also include a wireless communication interface 530, which can include without limitation a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an IEEE 802.11 device, an IEEE 802.15.4 device, a WiFi device, a WiMax device, cellular communication facilities, etc.), and/or the like. The wireless communication interface 530 may permit data to be exchanged with a network, wireless access points, other computer systems, and/or any other electronic devices described herein. The communication can be carried out via one or more wireless communication antenna(s) 532 that send and/or receive wireless signals 534.
  • Depending on desired functionality, the wireless communication interface 530 can include separate transceivers to communicate with base transceiver stations (e.g., base transceiver stations of a cellular network) and access points. These different data networks can include an OFDMA network and/or other types of networks.
  • The mobile device 120 can further include sensor(s) 540, as previously described. Such sensors can include, without limitation, one or more accelerometer(s), gyroscope(s), camera(s), magnetometer(s), altimeter(s), microphone(s), proximity sensor(s), light sensor(s), and the like. At least a subset of the sensor(s) 540 can provide camera frames and/or inertial information used by a VIT for tracking.
  • Embodiments of the mobile device may also include a Satellite Positioning System (SPS) receiver 580 capable of receiving signals 584 from one or more SPS satellites using an SPS antenna 582. Such positioning can be utilized to complement and/or be incorporated in the techniques described herein. It can be noted that, as used herein, an SPS may include any combination of one or more global and/or regional navigation satellite systems and/or augmentation systems, and SPS signals may include SPS, SPS-like, and/or other signals associated with such one or more SPS. GPS is an example of an SPS.
  • The mobile device 120 may further include and/or be in communication with a memory 560. The memory 560 can include, without limitation, local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data structures, such as the FIFO and/or other memory utilized by the techniques described herein, and may be allocated by hardware and/or software elements of an OFDM receiver. Additionally or alternatively, data structures described herein can be implemented by a cache or other local memory of a DSP 520 or processing unit 510. Memory can further be used to store an image stack, inertial sensor data, and/or other information described herein.
  • The memory 560 of the mobile device 120 also can comprise software elements (not shown), including an operating system, device drivers, executable libraries, and/or other code, such as one or more application programs, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above, such as the methods illustrated in FIGS. 3-4, might be implemented as code and/or instructions executable by the mobile device 120 (and/or processing unit 510 within a mobile device 120) and/or stored on a non-transitory and/or machine-readable storage medium (e.g., a “computer-readable storage medium,” a “machine-readable storage medium,” etc.). In an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose processor (or other device) to perform one or more operations in accordance with the described methods.
  • It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
  • The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
  • The term Computer Vision (CV) application as used herein refers to a class of applications related to the acquisition, processing, analyzing, and understanding of images. CV applications include, without limitation, mapping, modeling—including 3D modeling, navigation, augmented reality applications, and various other applications where images acquired from an image sensor are processed to build maps, models, and/or to derive/represent structural information about the environment from the captured images. In many CV applications, geometric information related to captured images may be used to build a map, model, and/or other representation of objects and/or other features in a physical environment.
  • It can be further noted that, although examples described herein are implemented by a mobile device, embodiments are not so limited. Embodiments can include, for example, personal computers and/or other electronics not generally considered “mobile.” A person of ordinary skill in the art will recognize many alterations to the described embodiments.
  • Terms “and” and “or,” as used herein, may include a variety of meanings that are expected to depend at least in part upon the context in which such terms are used. Typically, “or” if used to associate a list, such as A, B, or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B, or C, here used in the exclusive sense. In addition, the term “one or more” as used herein may be used to describe any feature, structure, or characteristic in the singular or may be used to describe some combination of features, structures, or characteristics. However, it should be noted that this is merely an illustrative example and claimed subject matter is not limited to this example. Furthermore, the term “at least one of” if used to associate a list, such as A, B, or C, can be interpreted to mean any combination of A, B, and/or C, such as A, AB, AA, AAB, AABBCCC, etc.
  • Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of embodiments. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bound the scope of the claims.

Claims (29)

What is claimed is:
1. A method of correcting drift in a tracking system of a mobile device, the method comprising:
obtaining location information regarding a target;
obtaining an image of the target, the image captured by the mobile device;
estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, wherein the pose comprises information indicative of a translation and orientation of the mobile device; and
correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
2. The method of claim 1, further comprising obtaining absolute coordinates of the target, wherein correcting the pose is further based, at least in part, on the absolute coordinates of the target.
3. The method of claim 1, further comprising processing the image of the target to determine that the target was captured in the image, wherein the processing includes comparing one or more keypoints of the image with one or more keypoints of each target in a plurality of known targets.
4. The method of claim 3, further comprising:
receiving one or more wireless signals from one or more access points;
determining a proximity of the one or more access points based on wireless signals; and
determining the plurality of known targets, based on the determined proximity of the one or more access points.
5. The method of claim 1, wherein the tracking system incorporates a Simultaneous Localization And Mapping (SLAM) system with the EKF.
6. The method of claim 1, wherein the pose determination is based, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device.
7. The method of claim 6, further comprising determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
8. A mobile device comprising:
a camera;
a memory; and
a processing unit operatively coupled with the camera and the memory and configured to:
obtain location information regarding a target;
obtain an image of the target, the image captured by the camera of the mobile device;
estimate, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, wherein the pose comprises information indicative of a translation and orientation of the mobile device; and
correct a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
9. The mobile device of claim 8, wherein the processing unit is configured to obtain absolute coordinates of the target, and wherein the processing unit is further configured to correct the pose based, at least in part, on the absolute coordinates of the target.
10. The mobile device of claim 8, wherein the processing unit is configured to process the image of the target to determine that the target was captured in the image, and wherein processing the image includes comparing one or more features of the image with one or more features of each target in a plurality of known targets.
11. The mobile device of claim 10, further comprising a wireless communication interface configured to receive one or more wireless signals from one or more access points, wherein the processing unit is further configured to:
determine a proximity of the one or more access points based on wireless signals; and
determine the plurality of known targets, based on the determined proximity of the one or more access points.
12. The mobile device of claim 8, wherein the processing unit is configured to incorporate a Simultaneous Localization And Mapping (SLAM) system with the EKF.
13. The mobile device of claim 8, further comprising one or more motion sensors, wherein the processing unit is further configured to determine the pose determination based, at least in part, on one or more measurements received from the one or more motion sensors.
14. The mobile device of claim 13, wherein the one or more motion sensors include one or more of an accelerometer or a gyroscope.
15. The mobile device of claim 13, wherein the processing unit is configured to determine a bias of the one or more motion sensors, based at least in part on the pose determination and the corrected pose.
16. An apparatus comprising:
means for obtaining location information regarding a target;
means for obtaining an image of the target, the image captured by a mobile device;
means for estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, wherein the pose comprises information indicative of a translation and orientation of the mobile device; and
means for correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
17. The apparatus of claim 16, further comprising means for obtaining absolute coordinates of the target, wherein the means for correcting the pose is configured to base the corrected pose, at least in part, on the absolute coordinates of the target.
18. The apparatus of claim 16, further comprising means for processing the image of the target to determine that the target was captured in the image, wherein the means for processing the image include means for comparing one or more features of the image with one or more features of each target in a plurality of known targets.
19. The apparatus of claim 18, further comprising:
means for receiving one or more wireless signals from one or more access points;
means for determining a proximity of the one or more access points based on wireless signals; and
means for determining the plurality of known targets, based on the determined proximity of the one or more access points.
20. The apparatus of claim 16, further comprising means for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF.
21. The apparatus of claim 16, further comprising means for basing the pose determination, at least in part, on measurements of one or more of an accelerometer or a gyroscope of the mobile device.
22. The apparatus of claim 21, further comprising means for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
23. A non-transitory machine-readable medium having instructions embedded thereon for correcting drift in a tracking system of a mobile device, the instructions including computer code for:
obtaining location information regarding a target;
obtaining an image of the target, the image captured by the mobile device;
estimating, from the image of the target, measurements relating to a pose of the mobile device based on the image and location information, wherein the pose comprises information indicative of a translation and orientation of the mobile device; and
correcting a pose determination of the mobile device using an Extended Kalman Filter (EKF), based, at least in part, on the measurements relating to the pose of the mobile device.
24. The non-transitory machine-readable medium of claim 23, the instructions further including computer code for obtaining absolute coordinates of the target, wherein the computer code is further configured to base correcting the pose, at least in part, on the absolute coordinates of the target.
25. The non-transitory machine-readable medium of claim 23, the instructions further including computer code for processing the image of the target to determine that the target was captured in the image, wherein the computer code for processing includes computer code for comparing one or more features of the image with one or more features of each target in a plurality of known targets.
26. The non-transitory machine-readable medium of claim 25, the instructions further including computer code for:
receiving one or more wireless signals from one or more access points;
determining a proximity of the one or more access points based on wireless signals; and
determining the plurality of known targets, based on the determined proximity of the one or more access points.
27. The non-transitory machine-readable medium of claim 23, the instructions further including computer code for incorporating a Simultaneous Localization And Mapping (SLAM) system with the EKF.
28. The non-transitory machine-readable medium of claim 23, wherein the computer code is configured to base the pose determination, at least in part, on measurements from one or more of an accelerometer or a gyroscope of the mobile device.
29. The non-transitory machine-readable medium of claim 28, the instructions further including computer code for determining a bias of one or more of the accelerometer or the gyroscope of the mobile device, based at least in part on the pose determination and the corrected pose.
US14/497,117 2013-09-27 2014-09-25 Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation Abandoned US20150092048A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/497,117 US20150092048A1 (en) 2013-09-27 2014-09-25 Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation
PCT/US2014/057630 WO2015048397A1 (en) 2013-09-27 2014-09-26 Off-target tracking using feature aiding in the context of inertial navigation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361883921P 2013-09-27 2013-09-27
US14/497,117 US20150092048A1 (en) 2013-09-27 2014-09-25 Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation

Publications (1)

Publication Number Publication Date
US20150092048A1 true US20150092048A1 (en) 2015-04-02

Family

ID=52739773

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/497,117 Abandoned US20150092048A1 (en) 2013-09-27 2014-09-25 Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation
US14/497,235 Active US9405972B2 (en) 2013-09-27 2014-09-25 Exterior hybrid photo mapping
US14/497,219 Active US9400930B2 (en) 2013-09-27 2014-09-25 Hybrid photo navigation and mapping
US15/195,921 Active 2034-11-09 US9947100B2 (en) 2013-09-27 2016-06-28 Exterior hybrid photo mapping

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/497,235 Active US9405972B2 (en) 2013-09-27 2014-09-25 Exterior hybrid photo mapping
US14/497,219 Active US9400930B2 (en) 2013-09-27 2014-09-25 Hybrid photo navigation and mapping
US15/195,921 Active 2034-11-09 US9947100B2 (en) 2013-09-27 2016-06-28 Exterior hybrid photo mapping

Country Status (6)

Country Link
US (4) US20150092048A1 (en)
EP (2) EP3049821B1 (en)
JP (2) JP6072361B2 (en)
KR (2) KR101753267B1 (en)
CN (2) CN105579811B (en)
WO (3) WO2015048397A1 (en)

Families Citing this family (132)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9733091B2 (en) * 2007-05-31 2017-08-15 Trx Systems, Inc. Collaborative creation of indoor maps
WO2011144968A1 (en) * 2010-05-19 2011-11-24 Nokia Corporation Physically-constrained radiomaps
US9625720B2 (en) * 2012-01-24 2017-04-18 Accipiter Radar Technologies Inc. Personal electronic target vision system, device and method
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20150219767A1 (en) * 2014-02-03 2015-08-06 Board Of Regents, The University Of Texas System System and method for using global navigation satellite system (gnss) navigation and visual navigation to recover absolute position and attitude without any prior association of visual features with known coordinates
CN107727076B (en) * 2014-05-05 2020-10-23 赫克斯冈技术中心 Measuring system
US9964409B1 (en) * 2014-05-27 2018-05-08 Apple Inc. Localized map generation
US10313656B2 (en) 2014-09-22 2019-06-04 Samsung Electronics Company Ltd. Image stitching for three-dimensional video
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US9488481B2 (en) * 2014-10-14 2016-11-08 General Electric Company Map presentation for multi-floor buildings
US10091418B2 (en) * 2014-10-24 2018-10-02 Bounce Imaging, Inc. Imaging systems and methods
WO2016069499A1 (en) * 2014-10-26 2016-05-06 Galileo Group, Inc. Methods and systems for surface informatics based detection with machine-to-machine networks and smartphones
US20160295548A1 (en) * 2015-04-03 2016-10-06 Hyundai Motor Company Apparatus for Controlling Message Receiving Mode and Method Thereof
US20160314592A1 (en) * 2015-04-23 2016-10-27 Nokia Technologies Oy Method and apparatus for registration of interior and exterior three dimensional scans based on a semantic feature of a structure
US10033941B2 (en) * 2015-05-11 2018-07-24 Google Llc Privacy filtering of area description file prior to upload
US10878278B1 (en) * 2015-05-16 2020-12-29 Sturfee, Inc. Geo-localization based on remotely sensed visual features
EP3304985B1 (en) * 2015-05-25 2020-03-18 Telefonaktiebolaget LM Ericsson (PUBL) Adaptive measurement report mapping for ue positioning
CN107534840A (en) * 2015-06-29 2018-01-02 尤比库姆特有限责任公司 Frame timing synchronization in a wireless local area network
JP6594686B2 (en) * 2015-07-14 2019-10-23 東急建設株式会社 Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and program
JP6546658B2 (en) * 2015-07-16 2019-07-17 日本電信電話株式会社 Satellite signal receiving apparatus, satellite signal receiving method and program
US10324195B2 (en) 2015-07-27 2019-06-18 Qualcomm Incorporated Visual inertial odometry attitude drift calibration
US9738399B2 (en) * 2015-07-29 2017-08-22 Hon Hai Precision Industry Co., Ltd. Unmanned aerial vehicle control method and unmanned aerial vehicle using same
US10048058B2 (en) * 2015-07-29 2018-08-14 Microsoft Technology Licensing, Llc Data capture system for texture and geometry acquisition
JP2017535086A (en) * 2015-07-31 2017-11-24 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method, imaging system, and program for image processing
DE102015010264A1 (en) * 2015-08-08 2017-02-09 Testo Ag Method for creating a 3D representation and corresponding image acquisition device
WO2017026758A1 (en) * 2015-08-12 2017-02-16 엘지전자 주식회사 Method for measuring location of user equipment in wireless communication system, and apparatus for performing same
US9952354B2 (en) 2015-09-11 2018-04-24 At&T Intellectual Property I, L.P. User equipment local barometric calibration
US10395116B2 (en) * 2015-10-29 2019-08-27 Hand Held Products, Inc. Dynamically created and updated indoor positioning map
US10706615B2 (en) * 2015-12-08 2020-07-07 Matterport, Inc. Determining and/or generating data for an architectural opening area associated with a captured three-dimensional model
US20170178330A1 (en) * 2015-12-17 2017-06-22 Institute For Information Industry System and method for interior mobile mapping
WO2017132607A1 (en) * 2016-01-29 2017-08-03 Noble Sensors, Llc Position correlated ultrasonic imaging
US10132933B2 (en) * 2016-02-02 2018-11-20 Qualcomm Incorporated Alignment of visual inertial odometry and satellite positioning system reference frames
US10591593B2 (en) * 2016-03-19 2020-03-17 Hipscience, Llc Point of reference displacement and motion sensor
US11232583B2 (en) * 2016-03-25 2022-01-25 Samsung Electronics Co., Ltd. Device for and method of determining a pose of a camera
WO2017172778A1 (en) * 2016-03-28 2017-10-05 Sri International Collaborative navigation and mapping
WO2017171908A1 (en) * 2016-04-01 2017-10-05 Intel Corporation Geo-information reporting for vehicle-to-vehicle sidelink communications
US10719983B2 (en) * 2016-04-06 2020-07-21 Anagog Ltd. Three dimensional map generation based on crowdsourced positioning readings
WO2017184040A1 (en) * 2016-04-20 2017-10-26 Telefonaktiebolaget Lm Ericsson (Publ) A wireless device, a positioning node and methods therein for positioning of a wireless device in a wireless communications network
US10234856B2 (en) * 2016-05-12 2019-03-19 Caterpillar Inc. System and method for controlling a machine
US10890600B2 (en) 2016-05-18 2021-01-12 Google Llc Real-time visual-inertial motion tracking fault detection
US11017610B2 (en) * 2016-05-18 2021-05-25 Google Llc System and method for fault detection and recovery for concurrent odometry and mapping
US10802147B2 (en) 2016-05-18 2020-10-13 Google Llc System and method for concurrent odometry and mapping
US9965689B2 (en) 2016-06-09 2018-05-08 Qualcomm Incorporated Geometric matching in visual navigation systems
WO2017214711A1 (en) * 2016-06-14 2017-12-21 Herring Rodney Software-defined radio earth atmosphere imager
TWI594643B (en) * 2016-07-01 2017-08-01 晶睿通訊股份有限公司 Wireless transmitting/receiving system, wireless receiving device and wireless transmitting/receiving method
CN107621265A (en) * 2016-07-14 2018-01-23 百度在线网络技术(北京)有限公司 A method and apparatus for indoor navigation
US10012517B2 (en) * 2016-08-01 2018-07-03 Infinity Augmented Reality Israel Ltd. Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data
EP3494549A4 (en) * 2016-08-02 2019-08-14 Magic Leap, Inc. Fixed-distance virtual and augmented reality systems and methods
CN106403942B (en) * 2016-08-30 2022-04-29 全球能源互联网研究院 Indoor inertial positioning method for substation personnel based on depth image recognition
US10928245B2 (en) * 2016-09-15 2021-02-23 Siteco Gmbh Light measurement using an autonomous vehicle
US10210603B2 (en) * 2016-10-17 2019-02-19 Conduent Business Services Llc Store shelf imaging system and method
TWI619093B (en) * 2016-10-19 2018-03-21 財團法人資訊工業策進會 Visual positioning apparatus, method, and computer program product thereof
CN110062871B (en) * 2016-12-09 2024-01-19 通腾全球信息公司 Method and system for video-based positioning and mapping
GB201621903D0 (en) * 2016-12-21 2017-02-01 Blue Vision Labs Uk Ltd Localisation
CN107223269B (en) * 2016-12-29 2021-09-28 达闼机器人有限公司 Three-dimensional scene positioning method and device
TW201823687A (en) * 2016-12-30 2018-07-01 鴻海精密工業股份有限公司 Navigation system and method for using the same
US10371530B2 (en) * 2017-01-04 2019-08-06 Qualcomm Incorporated Systems and methods for using a global positioning system velocity in visual-inertial odometry
US10893182B2 (en) 2017-01-10 2021-01-12 Galileo Group, Inc. Systems and methods for spectral imaging with compensation functions
CN106899935B (en) * 2017-01-18 2018-08-14 深圳大学 An indoor positioning method and system based on a radio receiver and a camera
US10816676B2 (en) * 2017-02-25 2020-10-27 Uti Limited Partnership GNSS/INS integration deep inside of inertial sensors
US10327218B2 (en) 2017-03-16 2019-06-18 Qualcomm Incorporated Robust downlink positioning
KR102309297B1 (en) * 2017-03-31 2021-10-06 엘지전자 주식회사 Terminal and method for controlling the same
GB201705767D0 (en) * 2017-04-10 2017-05-24 Blue Vision Labs Uk Ltd Co-localisation
US11740083B2 (en) * 2017-05-04 2023-08-29 Google Llc Methods and apparatus for curbside surveying
US10495762B2 (en) 2017-05-19 2019-12-03 Qualcomm Incorporated Non-line-of-sight (NLoS) satellite detection at a vehicle using a camera
CN107218935A (en) * 2017-05-23 2017-09-29 河南华泰规划勘测设计咨询有限公司 An indoor-space surveying and mapping data management system
US10417816B2 (en) * 2017-06-16 2019-09-17 Nauto, Inc. System and method for digital environment reconstruction
CN107167143A (en) * 2017-07-05 2017-09-15 乐高乐佳(北京)信息技术有限公司 Keypoint-based guided navigation method, device, and equipment
CN107257547B (en) * 2017-07-18 2020-05-22 歌尔科技有限公司 Equipment positioning method and device
CN110019570B (en) * 2017-07-21 2020-03-20 百度在线网络技术(北京)有限公司 Map construction method and device and terminal equipment
GB201712126D0 (en) 2017-07-27 2017-09-13 Crofton Group Ltd 3D model generation system and method
US10375669B2 (en) * 2017-08-04 2019-08-06 Qualcomm Incorporated Methods and systems for locating a mobile device using an asynchronous wireless network
WO2019039983A1 (en) * 2017-08-21 2019-02-28 Sony Mobile Communications Inc Method for reporting of positioning data
US10848199B1 (en) * 2017-08-29 2020-11-24 Syed Karim Systems and methods for communicating data over satellites
CN107703144A (en) * 2017-08-29 2018-02-16 南京航空航天大学 A wearable device and detection method for assisting ship station detection
CN107784633B (en) * 2017-09-04 2021-04-13 黄仁杰 Unmanned aerial vehicle aerial image calibration method suitable for plane measurement
CN107421537B (en) * 2017-09-14 2020-07-17 桂林电子科技大学 Object motion attitude sensing method and system based on rigid body grid of inertial sensor
US10802157B2 (en) * 2017-09-28 2020-10-13 Apple Inc. Three-dimensional city models and shadow mapping to improve altitude fixes in urban environments
WO2019083345A1 (en) * 2017-10-27 2019-05-02 엘지전자 주식회사 Method for performing otdoa-related operation by terminal in wireless communication system, and apparatus therefor
CN109831736B (en) * 2017-11-23 2022-01-18 腾讯科技(深圳)有限公司 Data processing method and device, server and client
US11740321B2 (en) * 2017-11-30 2023-08-29 Apple Inc. Visual inertial odometry health fitting
KR101988555B1 (en) * 2017-12-05 2019-06-12 충북대학교 산학협력단 Simultaneous localization and mapping system using illumination invariant image, and method for mapping pointcloud thereof
CN108376415B (en) * 2018-02-13 2022-01-21 中国联合网络通信集团有限公司 Trajectory filling method and device
JP6923739B2 (en) * 2018-02-28 2021-08-25 古野電気株式会社 Navigation equipment, VSLAM correction method, spatial information estimation method, VSLAM correction program, and spatial information estimation program
CN108490388B (en) * 2018-03-13 2021-06-29 同济大学 Multi-source combined indoor positioning method based on UWB and VLC technologies
US10982968B2 (en) 2018-03-29 2021-04-20 Nio Usa, Inc. Sensor fusion methods for augmented reality navigation
US10970924B2 (en) * 2018-06-17 2021-04-06 Foresight Ai Inc. Reconstruction of a scene from a moving camera
CN108988974B (en) * 2018-06-19 2020-04-07 远形时空科技(北京)有限公司 Time-delay measurement method and device, and system for time synchronization of electronic equipment
WO2020007483A1 (en) * 2018-07-06 2020-01-09 Nokia Technologies Oy Method, apparatus and computer program for performing three dimensional radio model construction
EP3821206A4 (en) 2018-07-13 2021-08-18 Labrador Systems, Inc. Visual navigation for mobile devices operable in differing environmental lighting conditions
CN109074407A (en) * 2018-07-23 2018-12-21 深圳前海达闼云端智能科技有限公司 Multi-source data mapping method, related device and computer-readable storage medium
US11080877B2 (en) 2018-08-02 2021-08-03 Matthew B. Schoen Systems and methods of measuring an object in a scene of a captured image
US11200421B1 (en) * 2018-08-22 2021-12-14 United Services Automobile Association (Usaa) Guided inspection system and method
US10834532B2 (en) * 2018-08-23 2020-11-10 NEC Laboratories Europe GmbH Method and system for wireless localization data acquisition and calibration with image localization
JP7190500B2 (en) * 2018-09-21 2022-12-15 古野電気株式会社 NAVIGATION DEVICE, NAVIGATION SUPPORT INFORMATION GENERATION METHOD, AND NAVIGATION SUPPORT INFORMATION GENERATION PROGRAM
US11061145B2 (en) * 2018-11-19 2021-07-13 The Boeing Company Systems and methods of adjusting position information
CN109520503A (en) * 2018-11-27 2019-03-26 南京工业大学 A SLAM method using square-root cubature fuzzy adaptive Kalman filtering
EP3668197B1 (en) 2018-12-12 2021-11-03 Rohde & Schwarz GmbH & Co. KG Method and radio for setting the transmission power of a radio transmission
CN109859178B (en) * 2019-01-18 2020-11-03 北京航空航天大学 FPGA-based infrared remote sensing image real-time target detection method
US11004224B2 (en) * 2019-01-22 2021-05-11 Velodyne Lidar Usa, Inc. Generation of structured map data from vehicle sensors and camera arrays
US11105735B1 (en) * 2019-02-19 2021-08-31 United Services Automobile Association (Usaa) Systems and methods for detecting properties relating to building components
US10922831B2 (en) * 2019-02-20 2021-02-16 Dell Products, L.P. Systems and methods for handling multiple simultaneous localization and mapping (SLAM) sources and algorithms in virtual, augmented, and mixed reality (xR) applications
EP3933443A4 (en) * 2019-02-25 2022-12-28 Furuno Electric Co., Ltd. Movement information calculation device and movement information calculation method
US11037018B2 (en) 2019-04-09 2021-06-15 Simmonds Precision Products, Inc. Navigation augmentation system and method
US11249197B2 (en) 2019-05-03 2022-02-15 Apple Inc. Image-based techniques for stabilizing positioning estimates
CN110376431A (en) * 2019-05-29 2019-10-25 国网宁夏电力有限公司电力科学研究院 A solution for measurement-acquisition anomaly faults
US20220306089A1 (en) * 2019-06-17 2022-09-29 Rohit Seth Relative Position Tracking Using Motion Sensor With Drift Correction
CN110519701B (en) * 2019-08-15 2021-02-12 广州小鹏汽车科技有限公司 Positioning information creating method, vehicle-mounted terminal, server device and positioning system
WO2021035471A1 (en) * 2019-08-26 2021-03-04 Beijing Voyager Technology Co., Ltd. Systems and methods for positioning a target subject
US10908299B1 (en) 2019-08-30 2021-02-02 Huawei Technologies Co., Ltd. User equipment positioning apparatus and methods
CN110634104B (en) * 2019-09-05 2022-11-18 北京智行者科技股份有限公司 Multi-map stitching method and device
CN110706356B (en) * 2019-09-19 2023-06-16 阿波罗智联(北京)科技有限公司 Path drawing method, path drawing device, electronic equipment and storage medium
CN110645986B (en) * 2019-09-27 2023-07-14 Oppo广东移动通信有限公司 Positioning method and device, terminal and storage medium
CN110702139B (en) * 2019-09-29 2021-08-27 百度在线网络技术(北京)有限公司 Time delay calibration method and device, electronic equipment and medium
CA3157392A1 (en) * 2019-10-11 2021-04-15 Foundat Pty Ltd Geographically referencing an item
US11417104B2 (en) * 2019-11-01 2022-08-16 Walmart Apollo, Llc Systems and methods for automatically determining location of an object inside a retail store
CN110866927B (en) * 2019-11-21 2021-07-20 哈尔滨工业大学 Robot localization and mapping method based on an EKF-SLAM algorithm combined with point-line foot features
CN111079654A (en) * 2019-12-18 2020-04-28 济南大陆机电股份有限公司 Hydrological equipment information acquisition method and system based on image recognition
US20230186496A1 (en) * 2020-05-11 2023-06-15 Carnegie Mellon University Vision Sensing Device and Method
US11128636B1 (en) 2020-05-13 2021-09-21 Science House LLC Systems, methods, and apparatus for enhanced headsets
CN113691929B (en) * 2020-05-15 2022-07-26 大唐移动通信设备有限公司 Positioning method and device
EP3944806A1 (en) * 2020-07-29 2022-02-02 Carl Zeiss Vision International GmbH Method for determining the near point, determining the near point distance, determining a spherical refractive index and for producing a spectacle lens and corresponding mobile terminals and computer programs
CN111817775B (en) * 2020-08-31 2020-12-15 亚太卫星宽带通信(深圳)有限公司 Method for updating terminal position in satellite-ground cooperation mode and satellite communication system
EP4050459A1 (en) * 2021-02-24 2022-08-31 V-Labs SA Calibration of a display device
CN112669250B (en) * 2021-03-16 2021-09-17 湖北亿咖通科技有限公司 Trajectory alignment method and electronic device
US11494920B1 (en) * 2021-04-29 2022-11-08 Jumio Corporation Multi-sensor motion analysis to check camera pipeline integrity
US20230003544A1 (en) * 2021-06-14 2023-01-05 Astra Navigation, Inc. Embedding a Magnetic Map into an Image File
CN113449058B (en) * 2021-06-25 2023-06-02 安克创新科技股份有限公司 Map data transmission method, cleaning robot and storage medium
US11741631B2 (en) 2021-07-15 2023-08-29 Vilnius Gediminas Technical University Real-time alignment of multiple point clouds to video capture
CN113405555B (en) * 2021-08-19 2021-11-23 智己汽车科技有限公司 Automatic driving positioning sensing method, system and device
WO2023118957A1 (en) * 2021-12-23 2023-06-29 Bosch Car Multimedia Portugal, S.A. System for distributed mapping of atmospheric attenuation in satellite communications
CN114967843A (en) * 2022-03-31 2022-08-30 Oppo广东移动通信有限公司 Terminal accessory, electronic equipment and positioning method

Family Cites Families (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1068489A2 (en) 1998-11-20 2001-01-17 Geometrix, Inc. Vision-assisted camera pose determination
JP2006351024A (en) * 2002-05-24 2006-12-28 Olympus Corp View-matching information presentation system and portable information terminal for use therewith
US7657079B2 (en) 2002-06-28 2010-02-02 Intel Corporation Single constraint at a time (SCAAT) tracking of a virtual reality (VR) display
CN1954628A (en) 2004-05-06 2007-04-25 三菱电机株式会社 Mobile terminal, server, information providing system, communication method of mobile terminal, communication method of server, and information providing method of information providing system
US9250081B2 (en) 2005-03-25 2016-02-02 Irobot Corporation Management of resources for SLAM in large environments
HU3056U (en) * 2005-04-29 2006-03-28 G & G Noevenyvedelmi Es Keresk Construction for making weed map
US7848881B2 (en) 2005-07-05 2010-12-07 Containertrac, Inc. Automatic past error corrections for location and inventory tracking
US8483704B2 (en) 2005-07-25 2013-07-09 Qualcomm Incorporated Method and apparatus for maintaining a fingerprint for a wireless network
US7599789B2 (en) 2006-05-24 2009-10-06 Raytheon Company Beacon-augmented pose estimation
WO2008024772A1 (en) 2006-08-21 2008-02-28 University Of Florida Research Foundation, Inc. Image-based system and method for vehicle guidance and navigation
US7628074B2 (en) 2007-03-15 2009-12-08 Mitsubishi Electric Research Laboratories, Inc. System and method for motion capture in natural environments
US9766074B2 (en) 2008-03-28 2017-09-19 Regents Of The University Of Minnesota Vision-aided inertial navigation
JP5396585B2 (en) * 2008-05-02 2014-01-22 株式会社ジオ技術研究所 Feature identification method
US8259692B2 (en) 2008-07-11 2012-09-04 Nokia Corporation Method providing positioning and navigation inside large buildings
US8538688B2 (en) 2008-11-18 2013-09-17 Nokia Corporation User generated pedestrian and indoor shortcut routes for navigation systems
US20100178934A1 (en) 2009-01-13 2010-07-15 Qualcomm Incorporated Environment-specific measurement weighting in wireless positioning
US20100255856A1 (en) 2009-04-03 2010-10-07 Microsoft Corporation Location Sensing Selection for Mobile Devices
US8174437B2 (en) * 2009-07-29 2012-05-08 Hemisphere Gps Llc System and method for augmenting DGNSS with internally-generated differential correction
US8290511B2 (en) 2009-10-01 2012-10-16 Qualcomm Incorporated Venue application for mobile station position estimation
CN102109348B (en) * 2009-12-25 2013-01-16 财团法人工业技术研究院 System and method for positioning a carrier, estimating carrier attitude, and building a map
US9157745B2 (en) 2010-01-14 2015-10-13 Qualcomm Incorporated Scalable routing for mobile station navigation with location context identifier
US8855929B2 (en) 2010-01-18 2014-10-07 Qualcomm Incorporated Using object to align and calibrate inertial navigation system
JP2011170599A (en) * 2010-02-18 2011-09-01 Mitsubishi Electric Corp Outdoor structure measuring instrument and outdoor structure measuring method
JP5685828B2 (en) * 2010-03-31 2015-03-18 富士通株式会社 Portable terminal, positioning program, and positioning system for the portable terminal
GB2479537B8 (en) 2010-04-12 2017-06-14 Vitec Group Plc Camera pose correction
WO2011144966A1 (en) 2010-05-19 2011-11-24 Nokia Corporation Crowd-sourced vision and sensor-surveyed mapping
US9229089B2 (en) * 2010-06-10 2016-01-05 Qualcomm Incorporated Acquisition of navigation assistance information for a mobile station
US20130202197A1 (en) 2010-06-11 2013-08-08 Edmund Cochrane Reeler System and Method for Manipulating Data Having Spatial Co-ordinates
EP2400261A1 (en) 2010-06-21 2011-12-28 Leica Geosystems AG Optical measurement method and system for determining 3D coordination in a measuring object surface
US8892118B2 (en) 2010-07-23 2014-11-18 Qualcomm Incorporated Methods and apparatuses for use in providing position assistance data to mobile stations
US9234965B2 (en) 2010-09-17 2016-01-12 Qualcomm Incorporated Indoor positioning using pressure sensors
EP2619742B1 (en) 2010-09-24 2018-02-28 iRobot Corporation Systems and methods for vslam optimization
US8676623B2 (en) 2010-11-18 2014-03-18 Navteq B.V. Building directory aided navigation
KR101207535B1 (en) * 2010-12-31 2012-12-03 한양대학교 산학협력단 Image-based simultaneous localization and mapping for moving robot
US20120190379A1 (en) 2011-01-25 2012-07-26 T-Mobile Usa, Inc. Intelligent Management of Location Sensor
US8320939B1 (en) 2011-04-21 2012-11-27 Google Inc. Crowd-sourced information for interior localization and navigation
US8510041B1 (en) 2011-05-02 2013-08-13 Google Inc. Automatic correction of trajectory data
US20120303255A1 (en) 2011-05-26 2012-11-29 INRO Technologies Limited Method and apparatus for providing accurate localization for an industrial vehicle
US8787944B2 (en) * 2011-08-18 2014-07-22 Rivada Research, Llc Method and system for providing enhanced location based information for wireless handsets
WO2013040411A1 (en) 2011-09-15 2013-03-21 Honda Motor Co., Ltd. System and method for dynamic localization of wheeled mobile robots
WO2013049597A1 (en) 2011-09-29 2013-04-04 Allpoint Systems, Llc Method and system for three dimensional mapping of an environment
US9155675B2 (en) 2011-10-12 2015-10-13 Board Of Trustees Of The University Of Arkansas Portable robotic device
CN103105852B (en) * 2011-11-14 2016-03-30 联想(北京)有限公司 Displacement calculation method and apparatus, and simultaneous localization and mapping method and equipment
US8626198B2 (en) 2011-11-16 2014-01-07 Qualcomm Incorporated Characterizing an indoor structure based on detected movements and/or position locations of a mobile device
US8908914B2 (en) 2012-01-17 2014-12-09 Maxlinear, Inc. Method and system for map generation for location and navigation with user sharing/social networking
WO2013108243A1 (en) 2012-01-18 2013-07-25 Weisman Israel Hybrid-based system and method for indoor localization
CA2864003C (en) 2012-02-23 2021-06-15 Charles D. Huston System and method for creating an environment and for sharing a location based experience in an environment
US8532885B1 (en) 2012-04-04 2013-09-10 Hemisphere Gnss Inc. Automatic GNSS signal allocation between remote and base receivers
JP5966542B2 (en) * 2012-04-10 2016-08-10 富士通株式会社 Trajectory analysis apparatus and trajectory analysis program
US10082584B2 (en) 2012-06-21 2018-09-25 Microsoft Technology Licensing, Llc Hybrid device location determination system
US9576183B2 (en) 2012-11-02 2017-02-21 Qualcomm Incorporated Fast initialization for monocular visual SLAM
US20150092048A1 (en) 2013-09-27 2015-04-02 Qualcomm Incorporated Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040167716A1 (en) * 2002-12-17 2004-08-26 Goncalves Luis Filipe Domingues Systems and methods for controlling a density of visual landmarks in a visual simultaneous localization and mapping system
US20050182518A1 (en) * 2004-02-13 2005-08-18 Evolution Robotics, Inc. Robust sensor fusion for mapping and localization in a simultaneous localization and mapping (SLAM) system
US20050234679A1 (en) * 2004-02-13 2005-10-20 Evolution Robotics, Inc. Sequential selective integration of sensor data
US20080065267A1 (en) * 2006-09-13 2008-03-13 Samsung Electronics Co., Ltd. Method, medium, and system estimating pose of mobile robots
US20100208057A1 (en) * 2009-02-13 2010-08-19 Peter Meier Methods and systems for determining the pose of a camera with respect to at least one object of a real environment
US20160035139A1 (en) * 2013-03-13 2016-02-04 The University Of North Carolina At Chapel Hill Low latency stabilization for head-worn displays

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11519729B2 (en) 2008-03-28 2022-12-06 Regents Of The University Of Minnesota Vision-aided inertial navigation
US11486707B2 (en) * 2008-03-28 2022-11-01 Regents Of The University Of Minnesota Vision-aided inertial navigation
US20160349057A1 (en) * 2010-10-13 2016-12-01 Elbit Systems Ltd. Multiple data sources pedestrian navigation system
US9405972B2 (en) 2013-09-27 2016-08-02 Qualcomm Incorporated Exterior hybrid photo mapping
US9400930B2 (en) 2013-09-27 2016-07-26 Qualcomm Incorporated Hybrid photo navigation and mapping
US9947100B2 (en) 2013-09-27 2018-04-17 Qualcomm Incorporated Exterior hybrid photo mapping
US10571546B2 (en) * 2015-09-25 2020-02-25 Intel Corporation Vision and radio fusion based precise indoor localization
US11467247B2 (en) 2015-09-25 2022-10-11 Intel Corporation Vision and radio fusion based precise indoor localization
US20180196118A1 (en) * 2015-09-25 2018-07-12 Intel Corporation Vision and radio fusion based precise indoor localization
US10565723B2 (en) 2016-02-02 2020-02-18 Samsung Electronics Co., Ltd. Systems and methods for drift correction
CN108604010A (en) * 2016-02-02 2018-09-28 三星电子株式会社 A method for correcting drift in a device and the device
WO2017135697A1 (en) * 2016-02-02 2017-08-10 Samsung Electronics Co., Ltd. A method for correcting drift in a device and the device
US10579162B2 (en) 2016-03-24 2020-03-03 Samsung Electronics Co., Ltd. Systems and methods to correct a vehicle induced change of direction
US11765339B2 (en) 2016-06-30 2023-09-19 Magic Leap, Inc. Estimating pose in 3D space
US11920934B2 (en) 2016-08-19 2024-03-05 Movidius Limited Path planning using sparse volumetric data
US10901420B2 (en) 2016-11-04 2021-01-26 Intel Corporation Unmanned aerial vehicle-based systems and methods for agricultural landscape modeling
US10928821B2 (en) 2016-11-04 2021-02-23 Intel Corporation Unmanned aerial vehicle-based systems and methods for generating landscape models
CN110268445A (en) * 2017-02-16 2019-09-20 高通股份有限公司 Camera auto-calibration with gyroscope
US11205283B2 (en) * 2017-02-16 2021-12-21 Qualcomm Incorporated Camera auto-calibration with gyroscope
US10097757B1 (en) 2017-03-24 2018-10-09 Fotonation Limited Method for determining bias in an inertial measurement unit of an image acquisition device
US11223764B2 (en) 2017-03-24 2022-01-11 Fotonation Limited Method for determining bias in an inertial measurement unit of an image acquisition device
US10757333B2 (en) 2017-03-24 2020-08-25 Fotonation Limited Method for determining bias in an inertial measurement unit of an image acquisition device
EP3379202A1 (en) * 2017-03-24 2018-09-26 FotoNation Limited A method for determining bias in an inertial measurement unit of an image acquisition device
EP3591623A4 (en) * 2017-03-29 2020-11-04 Rakuten, Inc. Display control device, display control method, and program
US11308705B2 (en) 2017-03-29 2022-04-19 Rakuten Group, Inc. Display control device, display control method, and program
WO2018204019A1 (en) 2017-05-05 2018-11-08 Irobot Corporation Methods, systems, and devices for mapping wireless communication signals for mobile robot guidance
EP3619004B1 (en) * 2017-05-05 2023-11-08 iRobot Corporation Robot guidance method by mapping wireless communication signals and mobile robot using said method
CN107577646A (en) * 2017-08-23 2018-01-12 上海莫斐信息技术有限公司 A high-precision trajectory computation method and system
US10776652B2 (en) * 2017-09-28 2020-09-15 Baidu Usa Llc Systems and methods to improve visual feature detection using motion-related data
US11175148B2 (en) 2017-09-28 2021-11-16 Baidu Usa Llc Systems and methods to accommodate state transitions in mapping
CN109584289A (en) * 2017-09-28 2019-04-05 百度(美国)有限责任公司 Systems and methods to accommodate state transitions in mapping
CN108235256A (en) * 2017-12-29 2018-06-29 幻视信息科技(深圳)有限公司 A SLAM-based combined positioning method, device, and storage medium
EP4250255A3 (en) * 2018-02-02 2024-03-13 Panasonic Intellectual Property Corporation of America Information transmitting method, and client device
CN109029496A (en) * 2018-05-30 2018-12-18 北京市遥感信息研究所 A single-scene radiometric calibration method for large-area-array optical cameras
US20210350142A1 (en) * 2018-09-17 2021-11-11 Beijing Sankuai Online Technology Co., Ltd. In-train positioning and indoor positioning
US11747477B2 (en) 2018-09-28 2023-09-05 Naver Labs Corporation Data collecting method and system
CN111458735A (en) * 2020-04-23 2020-07-28 内蒙古师范大学 Method and system for automatically identifying a closed route at a specified position
CN111798536A (en) * 2020-06-15 2020-10-20 北京三快在线科技有限公司 Method and device for constructing positioning map
US20220326336A1 (en) * 2021-04-07 2022-10-13 Zebra Technologies Corporation Mobile Device Locationing
WO2022216368A1 (en) * 2021-04-07 2022-10-13 Zebra Technologies Corporation Mobile device locationing
US11567163B2 (en) * 2021-04-07 2023-01-31 Zebra Technologies Corporation Mobile device locationing
GB2619879A (en) * 2021-04-07 2023-12-20 Zebra Tech Corp Mobile device locationing
US20230060417A1 (en) * 2021-08-31 2023-03-02 Palo Alto Research Center Incorporated System and method for selective image capture on sensor floating on the open sea
US11917337B2 (en) * 2021-08-31 2024-02-27 Xerox Corporation System and method for selective image capture on sensor floating on the open sea

Also Published As

Publication number Publication date
KR101750469B1 (en) 2017-07-04
KR20160063367A (en) 2016-06-03
US20160307328A1 (en) 2016-10-20
JP6072360B2 (en) 2017-02-01
US9947100B2 (en) 2018-04-17
WO2015048434A1 (en) 2015-04-02
EP3049821B1 (en) 2019-01-02
US9400930B2 (en) 2016-07-26
EP3049820A1 (en) 2016-08-03
CN105556329B (en) 2017-10-10
US20150094952A1 (en) 2015-04-02
CN105556329A (en) 2016-05-04
KR101753267B1 (en) 2017-07-19
US20150094089A1 (en) 2015-04-02
US9405972B2 (en) 2016-08-02
JP6072361B2 (en) 2017-02-01
WO2015048397A1 (en) 2015-04-02
CN105579811A (en) 2016-05-11
EP3049821A2 (en) 2016-08-03
CN105579811B (en) 2019-06-28
WO2015088628A2 (en) 2015-06-18
JP2016540187A (en) 2016-12-22
KR20160061401A (en) 2016-05-31
WO2015088628A3 (en) 2015-07-30
JP2016540186A (en) 2016-12-22
EP3049820B1 (en) 2019-12-25

Similar Documents

Publication Publication Date Title
US20150092048A1 (en) Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation
EP3090407B1 (en) Methods and systems for determining estimation of motion of a device
CN109154501B (en) Geometric matching in a visual navigation system
JP6239659B2 (en) Sensor calibration and position estimation based on vanishing point determination
US9476717B2 (en) Simultaneous localization and mapping by using Earth's magnetic fields
US20150192656A1 (en) Received signal direction determination in using multi-antennas receivers
US9159133B2 (en) Adaptive scale and/or gravity estimation
CN107850673A (en) Visual inertial odometry attitude drift calibration
EP2878924A1 (en) Method and system for automatically generating location signatures for positioning using inertial sensors
WO2014113159A1 (en) Orientation determination based on vanishing point computation
US11674807B2 (en) Systems and methods for GPS-based and sensor-based relocalization
EP3155503A1 (en) Methods and systems for calibrating sensors using recognized objects
WO2015168451A1 (en) Locally measured movement smoothing of gnss position fixes
Liu et al. Pushing Location Awareness to Context-Aware Augmented Reality

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRUNNER, CHRISTOPHER;RAMANANDAN, ARVIND;RAMACHANDRAN, MAHESH;AND OTHERS;SIGNING DATES FROM 20141020 TO 20150112;REEL/FRAME:034759/0925

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION