WO2013156987A1 - Proximity-based interface system and method - Google Patents

Proximity-based interface system and method

Info

Publication number
WO2013156987A1
Authority
WO
WIPO (PCT)
Prior art keywords
proximity
mobile device
information
control unit
database
Prior art date
Application number
PCT/IL2013/000039
Other languages
French (fr)
Inventor
Ronny VAN DEN BERGH
Elad DAHABANY LEVY
Avinoam SHAFRAN
Maxim HARPUNSKY
Original Assignee
Xtendi Software Technologies Ltd.
Priority date
Filing date
Publication date
Application filed by Xtendi Software Technologies Ltd.
Publication of WO2013156987A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality


Abstract

A proximity-based interface system enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the system comprising: at least one input unit configured to sense an object, and adapted to provide sensing information; a control unit having communication with a database having information layers about the object associated with respective object distances, the control unit adapted to: receive the sensing information from the at least one input unit, recognize the object, measure the proximity from the mobile device to the object, and select an information layer from the database based on the measured proximity; and at least one output unit configured to communicate with the control unit, and adapted to provide output to the user based upon the selected information layer.

Description

PROXIMITY-BASED INTERFACE SYSTEM AND METHOD
CROSS-REFERENCE TO RELATED APPLICATIONS
The present PCT application claims priority from US Provisional Patent Application no. 61/624,452, filed 16 April 2012, whose disclosure is incorporated herein by reference.
FIELD AND BACKGROUND OF THE INVENTION
The current invention relates to the management of information about objects viewed in the environment, and specifically to a proximity-based interface system and method.
Prior to the digital age, people learned about their surroundings through exploration. Methods of exploration have developed over time, from using our natural senses, such as smell and observation, to the introduction of tools, such as the magnifying glass, allowing us to explore objects from different physical perspectives and to expand our knowledge about the objects. With the introduction of the printing press, information about objects was made available in the form of books and journals. One could then access printed information to learn about objects, rather than directly exploring the surroundings. The digital revolution has transferred available printed information into a digital format, making the information accessible throughout the world and easily searchable.
As technology continues to develop at an ever-increasing rate, it is becoming easier to create digital content. The benefits of this development are obvious: information about every object becomes more available and more accessible. However, this development has also led to an "information overload". With all the information available to us, how do we differentiate between information which is relevant and information that is not?
Web portals and new search engines became popular methods during the 1990s for navigating digital content, using keywords to match a search query to the content. Although this search engine method has been widely used, it does not reflect intuitive, human behavior. To be able to choose the correct keywords, a typical searcher must have a basic understanding of how keywords affect search results, in addition to a prior basic knowledge about the information for which he is searching.
With mobile technology becoming more sophisticated and popular, mobile devices have allowed a user to immediately access digital information about any object, as long as the user has access to an Internet connection. By way of an example, a user can obtain information about a particular exhibit item while visiting a museum through his mobile internet-connected device. In another example, a user may use his mobile device to access an online search engine to retrieve information about an unfamiliar item he sees while walking on a street.
Mobile devices provide users with an interface allowing them to access information. Usually a Graphical User Interface (GUI) is used, which is controlled through physical gestures— i.e. touch and/or finger movements— or external control devices, such as a keyboard or a mouse. One problem is that the large quantity of information and the limits of mobile devices (such as a small screen, for example) sometimes make it difficult or cumbersome for a user to navigate correctly through various levels of information.
Additionally, currently available tools, such as online search engines, may not meet user requirements. Taking the aforementioned example of the person walking along the street, how would a person intuitively know what keywords to use to retrieve relevant/desirable information? Currently available tools for filtering digital information serve to break the link between the physical world and the digital information. Furthermore, most, if not all current methods of obtaining digital information are not natural/intuitive to human behavior but are rather forced upon users by the functional limitations of the GUI.
One valid question that may be asked is: "Are current methods of data interaction natural, or have we adapted our perception of interaction to the abilities of the technology and functionality of the GUI?" By way of an exemplary response to this question, have you ever actually "pinched" a painting to make it bigger?!
Some relevant prior art related to the problems noted hereinabove is listed hereinbelow.
In US Patent Application no. 2009/0197615, whose disclosure is incorporated herein by reference, Kim et al. describe a device and method for controlling the user interface of a mobile device. According to an embodiment, the method includes determining a distance relationship between a user's body and the mobile device, and changing a configuration of at least one input unit and at least one output unit depending upon the determined distance relationship.
Tsai et al., in US Patent Application no. 2010/0174421, whose disclosure is incorporated herein by reference, describe a mobile user interface, suitable for mobile computing devices, that uses device position/orientation in real space to select the portion of content that is displayed. Data from motion, distance, or position sensors are used to determine the relative position/orientation of the device with respect to the content. Content elements can be selected by centering the display on the desired portion, obviating the need for cursors and pointing devices. Magnification can be manipulated by moving the device away from or towards the user. 3-D content viewing may be enabled by sensing the device orientation and displaying content that is above or below the display in 3-D virtual space.
In US Patent no. 7,564,469, whose disclosure is incorporated herein by reference, Cohen describes methods of interacting with a mixed reality. A mobile device captures an image of a real-world object where the image has content information that can be used to control a mixed reality object through an offered command set. The mixed reality object can be real, virtual, or a mixture of both real and virtual.
In US Patent Application no. 2008/0144549 A1, whose disclosure is incorporated herein by reference, Marques describes a wireless proximity-based information system that enables users to gather and store information about objects in their immediate surroundings. The goal of the wireless proximity-based information system is to provide people with relevant information where and when they want it, and to provide on-the-spot information about objects of any type to users who are within the immediate proximity of the object.
Some shortcomings found in the relevant prior art are:
• Changes in information are displayed on a stationary screen and not on the user's mobile device's GUI.
• Additional digital information is not supplied regarding an object that is viewed through the mobile device.
• Changing proximity to the object does not provide additional information.
• No differentiation between the context of information based on the proximity of the user from the object.
There is therefore a need for a more intuitive system and a natural way of learning about the environment, taking advantage of both physical proximity and digital information.
SUMMARY OF THE INVENTION
According to an embodiment of the current invention, there is provided a proximity-based interface system enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the system comprising: at least one input unit configured to sense an object, and adapted to provide sensing information; a control unit having communication with a database having information layers about the object associated with respective object distances, the control unit adapted to: receive the sensing information from the at least one input unit, recognize the object, measure the proximity from the mobile device to the object, and select an information layer from the database based on the measured proximity; and at least one output unit configured to communicate with the control unit, and adapted to provide output to the user based upon the selected information layer. Preferably, an anti-jitter subsystem is adapted to select an information layer. Most preferably, the at least one input unit is integrated in the mobile device and/or is positionable remotely from the mobile device. Typically, the database is integrated in the mobile device and/or located remotely from the mobile device.
According to embodiments of the current invention, there is further provided a method of enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the method comprising the steps of: configuring at least one input unit to sense an object and to provide sensing information; taking a control unit to communicate with a database having information layers about the object associated with respective object distances, the control unit: receiving the sensing information from the at least one input unit, recognizing the object, measuring the proximity from the mobile device to the object, and selecting an information layer from the database based on the measured proximity; and configuring at least one output unit to communicate with the control unit, and to provide output to the user based upon the selected information layer.
Preferably, selecting an information layer is further performed using an anti- jitter subsystem.
According to embodiments of the current invention, there is further provided a proximity-based interface system enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the system comprising: at least one input unit configured to sense an object, and adapted to provide sensing information; a control unit having communication with a database having information layers about the object associated with respective object distances, the control unit adapted to: receive the sensing information from the at least one input unit, recognize the object, measure the proximity from the mobile device to the object, and select an information layer from the database based on the measured proximity; an anti-jitter subsystem associated with the control unit, the anti-jitter subsystem adapted to further select an information level based upon a change of the measured proximity; and at least one output unit configured to communicate with the control unit, and adapted to provide output to the user based upon the selected information layer.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
FIG 1 is a block diagram of a basic structure of a proximity-based interface system, in accordance with embodiments of the present invention;
FIG 2 is a functional block diagram of the proximity-based interface system of FIG 1, in accordance with embodiments of the present invention;
FIGS 3A-3C are pictorial representations of the mobile device, incorporating the proximity-based interface system, used to view an object at three respective exemplary distances, in accordance with embodiments of the present invention;
FIG 4 is a flow chart of a method of using a proximity-based interface system, in accordance with embodiments of the present invention;
FIGS 5A and 5B are functional block diagrams of the proximity-based interface method of FIG 4, in accordance with embodiments of the present invention;
FIG 6 is a schematic diagram of D, the distance between the mobile device and object, introducing elements of an anti-jitter subsystem of the proximity-based interface system, in accordance with embodiments of the present invention; and
FIGS 7, 8, and 9 are schematic diagrams of the proximity marker and functionality of the anti-jitter subsystem shown in FIG 6, in accordance with embodiments of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS
The present invention relates to information data management about objects in the environment and specifically about a proximity-based interface system and method.
In the description hereinbelow, examples of the term "mobile device" include, but are not limited to: a mobile phone; a tablet; a computer notebook; a PDA, a smartphone; a mobile gaming console; and augmented reality glasses.
Embodiments of the current invention take advantage of a proximity- based interface system, having a Natural User Interface (NUI), which enables a natural interaction with an object using a mobile device based on the proximity and/or distance between the object and the mobile device. Unlike current methods of interaction using digital information, wherein a user controls a GUI of a mobile device to obtain information relevant to the user's needs, a NUI proximity -based interface system, as further described hereinbelow in embodiments of the current invention, serves to shift control from the device interface to the information itself. Expressed another way, as the user adjusts his distance from an object (i.e. "proximity", as used in the specification and claims which follow herebelow) he simultaneously controls the interface and the level of information that can be obtained regarding the object.
NUI is based on a natural human behavior, such as the natural desire to approach an object to reveal more specific information about the object.
Conversely, NUI allows a user to literally "take a step back" to obtain information about an object in a broader context of the object's surrounding— as described further hereinbelow. NUI enables a natural connection between digital information and objects in the physical world.
Reference is currently made to FIG 1, which is a block diagram of a proximity-based interface system 4, in accordance with embodiments of the present invention. Proximity-based interface system 4 comprises: a mobile device (not shown in the figure), at least one input unit 5, a controller 7, and at least one output unit 9. The at least one input and output units may be incorporated within or outside the mobile device. In one embodiment of the current invention, proximity-based interface system 4 is incorporated completely within the mobile device.
Reference is currently made to FIG 2, which is a functional block diagram of proximity-based interface system 4, in accordance with embodiments of the present invention. Apart from differences described below, proximity-based interface system 4 is identical in notation, configuration, and functionality to that shown in FIG 1, and elements indicated by the same reference numerals and/or letters are generally identical in configuration, operation, and functionality as described hereinabove. Input unit 5 further includes a sensor 11 on the mobile device and/or an external sensor 13, meaning a sensor not located on the mobile device. As further described hereinbelow, the sensor provides sensing information used to measure distance, as known in the art. The sensor may incorporate technology such as, but not limited to: capacitive, electric field, inductive, Hall effect, reed, eddy current, magneto-resistant, optical shadow, optical-visual light, optical IR, optical color recognition, acoustic, radar, heat, sonar, and conductive or resistive technologies. Controller 7 receives the sensing information from the input unit and further includes an object recognition module 15 and a distance measurement module 17. Controller 7 serves to recognize an object and to measure the distance from the exemplary mobile device to the object, as further described hereinbelow.
Output unit 9 further includes a feedback module 19. Feedback module 19 is configured to output information relative to the object to one or more users (not shown in the present figure). The feedback module may be, but is not limited to: a display for image/video data; a speaker; and a vibrator unit.
Embodiments of the current invention include two required processes:
• object recognition, as known in the art; and
• distance measurement (i.e. "proximity") between the mobile device and an object. Distance measurement— a term used interchangeably hereinbelow with "proximity"— is further intended to mean the process of calculating a distance between the mobile device and an object. One plausible realization of both processes is sketched below.
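By way of illustration only, the following Python sketch shows one way these two processes could be realized with a camera-based input unit: a pinhole-camera range estimate and a crude feature-overlap matcher. Neither technique is mandated by the disclosure; all names and values here are hypothetical.

```python
from typing import Optional

# A minimal sketch of the two required processes, assuming a camera-based
# input unit. The pinhole range formula and feature-overlap matching are
# illustrative assumptions, not the specific methods claimed here.

def estimate_distance_m(focal_length_px: float,
                        real_width_m: float,
                        observed_width_px: float) -> float:
    """Pinhole-camera range estimate: D = f * W / w."""
    return focal_length_px * real_width_m / observed_width_px

def recognize(sensed: set, known_objects: dict) -> Optional[str]:
    """Return the stored object whose features best overlap the sensed ones."""
    best_id, best_overlap = None, 0
    for object_id, stored in known_objects.items():
        overlap = len(sensed & stored)
        if overlap > best_overlap:
            best_id, best_overlap = object_id, overlap
    return best_id

known = {"painting_17": {"gilt_frame", "blue_palette", "portrait"}}
print(recognize({"gilt_frame", "portrait"}, known))   # painting_17
print(estimate_distance_m(1000.0, 0.8, 400.0))        # 2.0 (metres)
```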
In an embodiment of the present invention one or more controller units are used for object recognition and/or distance measurement. Alternatively or optionally, only one controller unit is used for both processes, or different controller units are used for each process. Input units and controller units may be configured within the mobile device, and/or they may be configured externally to the mobile device. Communication between the input unit, controller unit, and the mobile device may be wired or wireless, as known in the art.
In an embodiment of the present invention one or more output units generate visual/aural/tactile feedback in response to control unit signals, as described further herein below.
With reference to functionality of proximity-based interface system 4 and the mobile device, the input and the output units may be linked solely by internal processing units (not shown in the figures) or by a combination of external processing units and internal processing units (not shown in the figures), as known in the art. As noted hereinabove, exemplary output units include, but are not limited to: an LCD display, a visual projection, an audio output unit, and a vibration unit— all as known in the art.
Reference is presently made to FIGS 3A-3C, which are pictorial representations of a mobile device 25, incorporating the proximity-based interface system, used to view an object at three respective exemplary distances (Distance 1, 2, and 3), in accordance with embodiments of the present invention. In view (a), mobile device 25 is typically moved and positioned by a user 26 to view an object 30. When mobile device 25 approaches the object, the proximity-based interface system— having the input, controller, and output units described hereinabove— functions to sense distance and to output information to the user. Although FIGS 3A-3C illustrate three exemplary distances (Distance 1, 2, and 3), embodiments of the current invention are not limited to three distances. Furthermore, each of the distances (Distance 1, 2, 3) may be further subdivided into sub-distance ranges, as further described below. As the mobile device is moved closer to or further from the object (corresponding to Distances 1, 2, and 3— and to sub-distance ranges within each distance), a respective "information layer" (information layers 1, 2, and 3, representing different aspects of the object, dependent upon the distance) is output to the user via the output unit. In the examples shown in FIGS 3A-3C, views (b) and detail (c), the output unit is an LCD display of mobile device 25; however, other user outputs may be employed, as noted hereinabove.
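As a purely illustrative sketch of this behavior, the snippet below maps a measured distance to one of three information layers via fixed thresholds. The threshold values, layer numbering, and layer payloads are assumptions invented for this sketch; in practice they would come from the database of information layers described in the disclosure.

```python
# Illustrative only: map a measured distance to an "information layer".
# The thresholds and payloads are assumed values for this sketch; the
# assumption that closer distances reveal more specific detail follows
# the specification's description of NUI behavior.

INFORMATION_LAYERS = [
    (1.0, "information layer 1: fine detail about the object"),
    (3.0, "information layer 2: summary of the object"),
    (float("inf"), "information layer 3: the object in its surroundings"),
]

def select_information_layer(distance_m: float) -> str:
    """Return the layer whose sub-distance range contains distance_m."""
    for upper_bound_m, layer in INFORMATION_LAYERS:
        if distance_m < upper_bound_m:
            return layer
    raise ValueError("unreachable: the last bound is infinite")

print(select_information_layer(0.5))   # fine detail, close to the object
print(select_information_layer(5.0))   # broader context, farther away
```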
Various examples of how the configuration of at least one or more of the output units may be changed based on distance detection (including sub-distance ranges) are discussed hereinbelow in reference to additional figures.
Embodiments of the current invention include a software implementation, having separate software modules, such as procedures and functions, each performing one or more of the functions and operations described hereinbelow. Software modules are written in any suitable programming language, stored in memory, and executed by a controller or processor, as known in the art. Mobile device 25 may be configured to operate within a communication system that transmits data over wireless and/or wired communication means and/or satellite-based communication systems. Such wireless systems may include, but are not limited to: cellular communication networks (GSM, LTE...); WiFi; WiMax; Bluetooth; radio wave transmission; infra-red signaling; and other methods of wireless signal transmission using either public or proprietary protocols.
Reference is currently made to FIG 4, which is a flow chart of a method of using a proximity-based interface system, in accordance with embodiments of the present invention. In general, the method involves two processes that may take place independently of one another: object recognition; and feedback given to the user by the output device, as described hereinbelow. In step 40, environment scanning, one or more sensors are used to sense/scan for the object. As noted hereinabove in FIGS 1 and 2, describing the proximity-based interface system, the at least one input unit is used to perform step 40.
In step 42, object recognized, a determination is made to proceed to measure/calculate the distance to the object— "YES", to step 44, or— "NO", to transfer control back to step 40 to rescan the environment. Recognition of the object in step 42 is performed by comparing input data (obtained from step 40) to stored data of the object for recognition. (Stored data for object recognition may be located on-board the exemplary mobile device or it may be remotely located, as known in the art.)
If the object is not recognized, or there is no object visible, control is transferred back to step 40, and the process repeats until an object is recognized in step 42. Once the object has been recognized, the distance between the mobile device and the object is calculated in step 44. In step 46, relevant feedback available, the controller is used to try to retrieve relevant digital information about the object, associated with a plurality of distances, from an on-board and/or a remote database. If there is no relevant digital information about the object corresponding to the current distance, "NO", control is reverted to step 40 to repeat the scan and the subsequent process. If relevant information is available, "YES", control is transferred to step 48, feedback given by output unit. The information is transferred from the controller to the output unit and/or the controller controls the one or more output units to give feedback to the user in the subsequent exemplary steps: step 50, device UI reconfiguration; step 52, augmented reality; step 54, sound and/or vibration; step 56, device functionality activation or reconfiguration; and step 58. Steps 50 through 58 are exemplary only, and other means and methods of providing feedback are included in embodiments of the current invention.
Control is presently reverted (not shown in the figure) to step 40 to prepare for the next environment scan. Optionally or alternatively, environmental scanning continues to take place while step 48, feedback given to output unit, takes place and/or upon the user's request. The overall control flow is sketched below.
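The following Python sketch summarizes the FIG 4 flow. The unit objects and their method names (scan, recognize, measure_distance, lookup, give_feedback) are hypothetical stand-ins for the input unit, controller, database, and output unit of FIGS 1 and 2, not an API defined by the disclosure.

```python
# A hedged sketch of the FIG 4 control loop; all objects and method names
# are hypothetical stand-ins for the units of FIGS 1-2.

def run_proximity_interface(input_unit, control_unit, database, output_unit):
    while True:
        sensing_info = input_unit.scan()                        # step 40
        obj = control_unit.recognize(sensing_info)              # step 42
        if obj is None:
            continue                                            # "NO": rescan
        distance = control_unit.measure_distance(sensing_info)  # step 44
        layer = database.lookup(obj, distance)                  # step 46
        if layer is None:
            continue                         # no relevant feedback: rescan
        output_unit.give_feedback(layer)     # step 48: UI reconfiguration,
                                             # AR, sound and/or vibration...
```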
Reference is currently made to FIGS 5A and 5B, which are functional block diagrams of the proximity -based interface method of FIG 4, in accordance with embodiments of the present invention. Apart from differences described below, mobile device 25, user 26, object 30, and external sensor 13 are identical in notation, configuration, and functionality to that shown in FIGS 3A-C, and elements indicated by the same reference numerals and/or letters are generally identical in configuration, operation, and functionality as described hereinabove in previous figures.
In FIG 5A, step 60, control begins with user 26 moving mobile device 25, and object 30 is scanned (not shown in the current figure). In step 61, the control unit serves to recognize the object. (As noted previously in FIG 4, if no object is recognized, control is reverted to another scan.) In step 62, the distance between the mobile device and the object is calculated, based on the scanned information from the one or more input units. Additionally, the data from the input unit(s) is compared with the local or remote storage to ensure there is digital information corresponding to the distance (not shown in the current figure). In step 63, a request is sent to storage 65 (on-board and/or remote) by the control unit to retrieve relevant digital information about the recognized object. In step 67, the control unit receives feedback information from storage, and in step 69, the control unit controls the output unit to provide feedback to the user, as previously described hereinabove.
In FIG 5B, control and steps are similar to those described in FIG 5A, except for step 61a in the current figure. In step 61a, external sensor 13 scans object 30. Control then proceeds to step 62, as described hereinabove in FIG 5A. It is noted that in embodiments of the current invention the proximity-based interface system may optionally or alternatively function with an on-board sensor and an external (remote) sensor.
Reference is presently made to FIG 6, which is a schematic diagram of D, the distance between mobile device 25 and object 30, introducing elements of an anti-jitter subsystem of the proximity-based interface system, in accordance with embodiments of the present invention. Apart from differences described below, mobile device 25 and object 30 are identical in notation, configuration, and functionality to that shown in FIGS 3A-C and 5A and 5B, and elements indicated by the same reference numerals and/or letters are generally identical in configuration, operation, and functionality as described hereinabove in previous figures.
In the specification and claims which follow, the term "jitter" is intended to mean instability and/or rapid, undesirable changes in output due to a hysteresis-type situation. As such, the term "anti-jitter" is intended to mean a suppression of a jitter in output, as known in the art.
Distance D is determined as described in previous figures. Additionally, as noted hereinabove, D is mathematically/digitally divided into a plurality of sub-distance ranges (each sub-distance having its respective "information layer", as previously described). Three exemplary sub-distances are indicated in FIG 6 as d1, d2, and d3; however, more or fewer sub-distances may be used. The demarcation between d1 and d2 is called "Proximity Marker 1" (PM1) 70, and the demarcation between d2 and d3 is called "Proximity Marker 2" (PM2) 72.
Additionally, respective proximity markers have pairs of "regions" before and after the marker, respectively called "before proximity marker" (BPM) and "after proximity marker" (APM). BPM and APM are defined by the proximity-based system dynamically, statically in advance, or both, to have distance values smaller than the respective d1, d2, and d3 values. BPM1 74 and APM1 76 are associated with PM1 70, whereas BPM2 78 and APM2 79 are associated with PM2 72. The functionality and interaction of BPM1 74, APM1 76, BPM2 78, and APM2 79 are described further hereinbelow. By way of example, when mobile device 25 is within distance D of object 30 and within sub-distance d1 (corresponding to information level 1), the output unit display may change or a sound may be played— as described hereinabove. Additionally, when mobile device 25 is within distance D of object 30 and within sub-distance d2 (corresponding to information level 2), the output unit display may change to an alternate display or an alternate sound may be played.
However, as the mobile device is moved (or the object moves in relation to the mobile device) and PM1 is effectively "crossed"— i.e., the sub-distance changes from d1 to d2 or vice versa— there could be jitter in the information level output to the user, due to inadvertent movement (or digital/communication sources yielding a scenario similar to physical movement) relatively close to PM1. In the examples below, it is assumed that the object remains stationary and the mobile device is moved; however, a change in calculated distance is central to the following discussion, so that movement of the mobile device, of the object, or of both— or any other factor affecting the calculation— can affect the distances D, d1, d2, etc.
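One plausible way to represent a proximity marker and its guard regions in software is sketched below; the dataclass layout, field names, and region widths are assumptions made for illustration only.

```python
from dataclasses import dataclass

# Illustrative representation of a proximity marker with its BPM/APM guard
# regions; the structure and widths are assumptions, not the patent's design.

@dataclass
class ProximityMarker:
    position_m: float     # demarcation between two sub-distance ranges
    bpm_width_m: float    # width of the "before proximity marker" region
    apm_width_m: float    # width of the "after proximity marker" region

    def region(self, distance_m: float) -> str:
        """Classify a measured distance relative to this marker."""
        if self.position_m - self.bpm_width_m <= distance_m < self.position_m:
            return "BPM"
        if self.position_m <= distance_m < self.position_m + self.apm_width_m:
            return "APM"
        return "outside"

pm1 = ProximityMarker(position_m=1.0, bpm_width_m=0.1, apm_width_m=0.1)
print(pm1.region(0.95))   # BPM: just before the marker
print(pm1.region(1.05))   # APM: just after the marker
```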
Reference is currently made to FIGS 7, 8, and 9, which are schematic diagrams of the proximity marker and functionality of the anti-jitter subsystem of FIG 6, in accordance with embodiments of the present invention. Apart from differences described below, PM1 70, BPM1 74, and APM1 76 are identical in notation, configuration, and functionality to that shown in FIG 6, and elements indicated by the same reference numerals and/or letters are generally identical in configuration, operation, and functionality as described hereinabove in previous figures.
In FIG 7, an initial position of the mobile device is indicated 81 in view (a). If, as shown in view (b), the next position 83 reflects a position of the mobile device just to the right of PM1 70 (and into region d2), a jitter condition could exist for outputs associated with information level 1 and information level 2, due to any of the reasons noted hereinabove. To avoid a jitter condition and to contribute to more stability, the system mathematically moves PM1 70 to a new position PM1' 85 to ensure next position 83 is sufficiently distanced from PM1'.
In FIG 8, view (a), an initial position 85 indicates the mobile device positioned within BPM1 74 but close to PM1 70. To avoid a jitter condition and to contribute to more stability— see view (b)— the system mathematically moves the initial position to a new initial position 86, away from PM1 70 and out of BPM1 74. Optionally or alternatively, in view (a) the initial position is maintained, but PM1 70, along with the associated BPM and APM, is moved to a new position to the right (not shown in the current figure), yielding an effectively similar relative separation between the initial position, the PM1, APM, and BPM as shown in view (b).
In FIG 9, view (a), an initial position 88 situates the mobile device within APM1 76 but close to PM1 70. To avoid a jitter condition and to contribute to more stability, the system mathematically moves the initial position to a new initial position 89, into the region of d1, away from PM1 70 and out of BPM1 74. Optionally or alternatively— and similarly to the description of FIG 8 hereinabove— PM1 70, along with associated BPM1 74 and APM1 76, is moved to a new position to the right (not shown in the current figure), yielding an effectively similar relative separation between the initial position, the PM1, APM, and BPM as shown in view (b).
As a default, whenever a potential jitter situation exists (based on relative positions of the proximity markers, the associated BPM and APM, and the mobile device), the system acts to stabilize and adjusts for a "wider-angle view" of the object— meaning a configuration wherein the initial position is relatively biased, with regard to the PM1, into region d1 and out of the BPM— thereby yielding "information layer 1" user feedback. However, embodiments of the current invention optionally or alternatively include the system acting to stabilize and adjust for a "narrow-angle view" of the object— a configuration wherein the initial position is relatively biased, with regard to the PM1, into region d2 and out of the APM— thereby yielding "information layer 2" user feedback.
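Gathering the behavior of FIGS 7-9 into one hedged sketch: when a measured position lands inside a marker's guard zone, the marker (or, in the alternative variant, the reported position) is shifted so the two end up well separated, forming a hysteresis. The shift policy and guard width below are illustrative assumptions, not the patent's prescribed algorithm.

```python
# Hedged sketch of the anti-jitter adjustment of FIGS 7-9. This variant
# moves the marker; moving the reported position instead is the alternative
# the text also describes. The guard width is an assumed value.

def stabilize_marker(marker_m: float, guard_m: float,
                     position_m: float) -> float:
    """Shift the marker, if needed, so position_m lies outside the
    [marker - guard, marker + guard) jitter zone around it."""
    if abs(position_m - marker_m) >= guard_m:
        return marker_m                 # already stable: no change
    if position_m >= marker_m:
        return position_m - guard_m     # FIG 7: pull the marker back (PM1')
    return position_m + guard_m         # FIG 8 variant: push the marker on

pm1_m = 1.00
pm1_m = stabilize_marker(pm1_m, guard_m=0.10, position_m=1.02)
print(pm1_m)   # 0.92: the device now sits clearly inside d2, so small
               # tremors no longer flip between information layers 1 and 2
```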
Exemplary applications of the proximity-based information system disclosed hereinabove in embodiments of the current invention include, but are not limited to, the following.
Smart phones/Tablets
Smartphones and tablet computers comprise an LCD screen, a camera, and a network connection, which allow them to support embodiments of this invention. The camera can be utilized for object recognition and proximity measurement. A user with a smartphone/tablet, with the proximity-based interface system installed on it, can visit a museum and aim the smartphone/tablet at one of the objects or paintings on the wall; depending on the distance between the smartphone/tablet and the object or painting, digital content will appear on the smartphone's display in relation to the object or painting.
Augmented Reality Glasses
AR glasses comprise lenses that have the capability of displaying digital content, a camera, and a network connection, which allow them to support embodiments of this invention. The camera can be utilized for object recognition and proximity measurement.
A user wearing the AR glasses is able to walk through his town while information about the surroundings appears on the lenses of the glasses as he continues to move through his surroundings. The digital content displayed on the lenses, about the objects the user looks at, changes as the user moves closer to or further away from the objects; all while the user is able to see his surroundings through the transparent lenses.
It will be appreciated that the above descriptions are intended only to serve as examples, and that many other embodiments are possible within the scope of the present invention as defined in the appended claims.

Claims

1. A proximity-based interface system enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the system comprising:
at least one input unit configured to sense an object, and adapted to provide sensing information;
a control unit having communication with a database having information layers about the object associated with respective object distances, the control unit adapted to:
receive the sensing information from the at least one input unit, recognize the object,
measure the proximity from the mobile device to the object, and select an information layer from the database based on the measured proximity; and
at least one output unit configured to communicate with the control unit, and adapted to provide output to the user based upon the selected information layer.
2. A system according to claim 1, wherein an anti-jitter subsystem is adapted to select an information layer.
3. A system according to claim 2, wherein the at least one input unit is integrated in the mobile device.
4. A system according to claim 2, wherein the at least one input unit is positionable remote from the mobile device.
5. A system according to claim 2, wherein the database is integrated in the mobile device.
6. A system according to claim 2, wherein the database is positionable remote from the mobile device.
7. A method of enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the method comprising the steps of:
configuring at least one input unit to sense an object and to provide sensing information;
configuring a control unit to communicate with a database having information layers about the object associated with respective object distances, the control unit:
receiving the sensing information from the at least one input unit, recognizing the object, measuring the proximity from the mobile device to the object, and
selecting an information layer from the database based on the measured proximity; and
configuring at least one output unit to communicate with the control unit, and to provide output to the user based upon the selected information layer.
8. A method according to claim 7, wherein selecting an information layer is further performed using an anti-jitter subsystem.
9. A proximity-based interface system enabling user interaction with an object using a mobile device dependent on a proximity between the object and the mobile device, the system comprising:
at least one input unit configured to sense an object, and adapted to provide sensing information;
a control unit having communication with a database having information layers about the object associated with respective object distances, the control unit adapted to:
receive the sensing information from the at least one input unit, recognize the object,
measure the proximity from the mobile device to the object, and select an information layer from the database based on the measured proximity;
an anti-jitter subsystem associated with the control unit, the anti-jitter subsystem adapted to further select an information layer based upon a change of the measured proximity; and
at least one output unit configured to communicate with the control unit, and adapted to provide output to the user based upon the selected information layer.
PCT/IL2013/000039 2012-04-16 2013-04-15 Proximity-based interface system and method WO2013156987A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261624452P 2012-04-16 2012-04-16
US61/624,452 2012-04-16

Publications (1)

Publication Number Publication Date
WO2013156987A1 true WO2013156987A1 (en) 2013-10-24

Family

ID=49383013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2013/000039 WO2013156987A1 (en) 2012-04-16 2013-04-15 Proximity-based interface system and method

Country Status (1)

Country Link
WO (1) WO2013156987A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080144549A1 (en) * 2006-12-14 2008-06-19 Todd Marques Wireless Proximity-Based Information System
US20110164163A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Synchronized, interactive augmented reality displays for multifunction devices
EP2400733A1 (en) * 2010-06-28 2011-12-28 Lg Electronics Inc. Mobile terminal for displaying augmented-reality information
WO2012001218A1 (en) * 2010-06-30 2012-01-05 Nokia Corporation Methods, apparatuses and computer program products for providing a constant level of information in augmented reality
WO2012038589A1 (en) * 2010-09-22 2012-03-29 Nokia Corporation Apparatus and method for proximity based input

Similar Documents

Publication Publication Date Title
US8762895B2 (en) Camera zoom indicator in mobile devices
US11922588B2 (en) Cooperative augmented reality map interface
US8938558B2 (en) Modifying functionality based on distances between devices
WO2018236499A1 (en) Augmented reality interface for interacting with displayed maps
Fröhlich et al. On the move, wirelessly connected to the world
KR100732524B1 (en) Mobile Communication Terminal With Displaying Information Continuously And Its Method
KR102639605B1 (en) Systems and methods for relative representation of spatial objects and disambiguation in an interface
KR102402048B1 (en) Electronic apparatus and the controlling method thereof
JP6429886B2 (en) Touch control system and touch control method
CN104081307A (en) Image processing apparatus, image processing method, and program
WO2014194513A9 (en) A method and apparatus for self-adaptively visualizing location based digital information
CN106846496A (en) DICOM images based on mixed reality technology check system and operating method
JPWO2015159602A1 (en) Information provision device
JPWO2019069575A1 (en) Information processing equipment, information processing methods and programs
WO2017025663A1 (en) Searching image content
KR101568741B1 (en) Information System based on mobile augmented reality
CN111158556A (en) Display control method and electronic equipment
WO2013156987A1 (en) Proximity-based interface system and method
WO2017033544A1 (en) Information processing device, information processing method, and program
TW201106210A (en) Large scale picture browsing apparatus and method thereof
WO2013175341A2 (en) Method and apparatus for controlling multiple devices
JP7386583B1 (en) Program, information processing device and method
KR102075357B1 (en) Method for searching based pixel
KR101914205B1 (en) Apparatus of processing user interface
CN117631817A (en) Measurement method, measurement device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13778909

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC OF 120215

122 Ep: pct application non-entry in european phase

Ref document number: 13778909

Country of ref document: EP

Kind code of ref document: A1