US20170132896A1 - Notification System For Providing Awareness Of An Interactive Surface - Google Patents


Info

Publication number
US20170132896A1
Authority
US
United States
Prior art keywords
trajectory
glass surface
user electronic
interactive surface
collide
Prior art date
Legal status
Granted
Application number
US15/412,183
Other versions
US9911298B2
Inventor
James H. Pratt
Steven M. Belz
Marc A. Sullivan
Current Assignee
AT&T Intellectual Property I LP
Original Assignee
AT&T Intellectual Property I LP
Priority date
Filing date
Publication date
Application filed by AT&T Intellectual Property I LP filed Critical AT&T Intellectual Property I LP
Priority to US15/412,183
Assigned to AT&T INTELLECTUAL PROPERTY I, L.P. Assignors: BELZ, STEVEN M., PRATT, JAMES H., SULLIVAN, MARC A.
Publication of US20170132896A1
Application granted
Publication of US9911298B2
Legal status: Active


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002 Specific input/output arrangements not covered by G06F 3/01 - G06F 3/16
    • G06F 3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 7/00 Signalling systems according to more than one of groups G08B 3/00 - G08B 6/00; Personal calling systems according to more than one of groups G08B 3/00 - G08B 6/00
    • G08B 7/06 Signalling systems according to more than one of groups G08B 3/00 - G08B 6/00; Personal calling systems according to more than one of groups G08B 3/00 - G08B 6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30241 Trajectory
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/21 Collision detection, intersection
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Definitions

  • the present application relates to collision warning systems, and more particularly to a system and method for preventing a collision with an interactive surface.
  • network-connected computers can display computer interfaces on a glass table for users sitting at the table during a meeting; glass walls can have interfaces that users can interact with to perform a variety of functions; sliding glass doors in homes can have computer displays that users can interact with; and patient bedrooms in hospitals can have glass walls that doctors can use to view medical information.
  • a system and accompanying methods for providing awareness of an interactive surface are disclosed.
  • the system may be configured to generate various types of notifications or perform other actions if a person or an object has a trajectory that indicates that the person or object will collide with an interactive surface.
  • various types of notifications may include, but are not limited to, one or more of adjusting the opacity of the interactive surface, presenting different types of messages on the glass surface, emitting sounds and vibrations, and displaying various types of images or colors on the glass surface.
  • the trajectory of the person or object may be determined based on determining position and velocity by using one or more cameras, sensors, location positioning systems, and other systems and devices that can capture and process information about the person or the object.
  • if the person or object remains on the collision trajectory, the notifications may increase in intensity.
  • the system may cause the notifications to decrease in intensity or remove the notifications altogether.
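The escalation and de-escalation behavior described in the two items above can be sketched as follows. The function name, step size, and intensity bounds are illustrative assumptions for this sketch, not part of the disclosure:

```python
# Illustrative sketch (not the patent's implementation): notification
# intensity rises while the object stays on a collision course and decays
# back to zero (removal) once the trajectory no longer intersects the surface.

def update_intensity(intensity: float, on_collision_course: bool,
                     step: float = 0.25) -> float:
    """Return the new notification intensity, clamped to [0.0, 1.0]."""
    if on_collision_course:
        return min(1.0, intensity + step)   # escalate toward full intensity
    return max(0.0, intensity - step)       # de-escalate, eventually removing it

# A person approaches for three sensor ticks, then veers away for two.
level = 0.0
for heading_at_surface in (True, True, True, False, False):
    level = update_intensity(level, heading_at_surface)
```

A real system would map `level` onto concrete outputs such as opacity changes, message urgency, or the volume of emitted sounds and vibrations.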
  • the system for preventing a collision with an interactive surface may include a memory for storing instructions and a processor that is communicatively coupled to the interactive surface.
  • the processor may execute the instructions to perform operations comprising receiving media content associated with an object that is within a range of the interactive surface. Additionally, the processor may be configured to detect a position and a velocity of the object that is in the range of the interactive surface. The processor may detect the position and the velocity based at least in part on the recorded media content. Also, the processor may determine if the object has a trajectory that would cause the object to collide with the interactive surface. The trajectory may be determined based on the position and the velocity of the object. Furthermore, the processor may generate a notification if the object is determined to have a trajectory that would cause the object to collide with the interactive surface.
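The chain of operations listed above (receive media content, detect position and velocity, determine trajectory, generate notification) can be reduced to a one-dimensional sketch, under the simplifying assumption that the media content has already yielded a distance from the surface and an approach speed. All names and the three-second horizon are illustrative assumptions:

```python
# Illustrative sketch of the processor's receive -> detect -> determine ->
# notify pipeline, reduced to one dimension. Not the patent's implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    position: float  # distance from the interactive surface, in metres
    velocity: float  # approach speed in m/s (positive = moving toward it)

def will_collide(obs: Observation, horizon_s: float = 3.0) -> bool:
    """Does the current trajectory reach the surface within the horizon?"""
    if obs.velocity <= 0:
        return False  # stationary or moving away: no collision predicted
    return obs.position / obs.velocity <= horizon_s

def process(obs: Observation) -> Optional[str]:
    """Generate a notification only when a collision is predicted."""
    if will_collide(obs):
        return "Warning! Nearing Glass Surface!"
    return None
```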
  • a method for providing awareness of an interactive surface may include receiving media content associated with an object that is within a range of the interactive surface. Additionally, the method may include determining a position and a velocity of the object that is in the range of the interactive surface based at least in part on the media content.
  • a processor may be configured to determine the position and the velocity of the object by executing instructions stored in a memory.
  • the method may also include determining if the object has a trajectory that would cause the object to collide with the interactive surface. The trajectory of the object may be determined based on the position and the velocity of the object.
  • the method may include generating a notification if the object is determined to have the trajectory that would cause the object to collide with the interactive surface.
  • a tangible computer-readable medium comprising instructions for providing awareness of an interactive surface is also disclosed.
  • the computer instructions, when loaded and executed by a processor, may cause the processor to perform operations including the following: receiving media content associated with an object that is within a range of an interactive surface; determining a position and a velocity of the object that is in the range of the interactive surface based at least in part on the media content; determining if the object has a trajectory that would cause the object to collide with the interactive surface, wherein the trajectory is determined based on the position and the velocity of the object; and generating a notification if the object is determined to have the trajectory that would cause the object to collide with the interactive surface.
  • FIG. 1A is a schematic diagram illustrating a person that is approaching an interactive surface of an awareness system according to the present disclosure.
  • FIG. 1B is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered various notifications from the interactive surface of FIG. 1A.
  • FIG. 1C is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered more intense notifications from the interactive surface of FIG. 1A.
  • FIG. 2A is a schematic diagram illustrating a person that is approaching an interactive surface of another embodiment of the awareness system.
  • FIG. 2B is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered a notification from the interactive surface of FIG. 2A.
  • FIG. 2C is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered more intense notifications from the interactive surface of FIG. 2A.
  • FIG. 3 is a flow diagram that illustrates a sample method for receiving various types of notifications based on the person's trajectory with respect to an interactive surface according to the present disclosure.
  • FIG. 4 is a flow diagram illustrating another sample method for receiving various types of notifications based on a person's trajectory with respect to an interactive surface.
  • FIG. 5 is a flow diagram illustrating another method for providing awareness of an interactive surface according to an embodiment of the present disclosure.
  • FIG. 6 is a diagrammatic representation of a machine in the form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed herein.
  • a system 100 for providing awareness of an interactive surface 110 is disclosed in the present disclosure.
  • the system 100 alerts a person to the presence of an interactive surface 110 to prevent a collision with the interactive surface 110 .
  • the system 100 for providing awareness of an interactive surface 110 may be configured to generate various types of notifications or perform other actions if a user or an object 105 has a trajectory that indicates that the user or object will collide with an interactive surface 110 or other surface.
  • various types of notifications may include, but are not limited to including, one or more of adjusting the opacity of the interactive surface 110 , presenting different types of messages on or near the interactive surface 110 , emitting sounds and vibrations from or around the interactive surface 110 , or displaying various types of images or colors on or near the interactive surface 110 .
  • the system 100 may determine the trajectory of the person or object 105 based on determining the person or object's position and velocity by using one or more image capture devices 120 , sensors, location positioning systems, and other systems and devices that can retrieve or process information, or both, relating to the person or the object 105 . If the person or object 105 continues on the trajectory to collide with the interactive surface 110 , the system 100 may increase the intensity of the notifications or actions. However, if the person or object 105 adjusts its trajectory so that the person or object is no longer on a path to collide with the interactive surface 110 , the system 100 may decrease the intensity of the notifications or even remove the notifications altogether. Also, the system 100 may utilize sensors to determine one or more of air movement, sounds, vibrations or other detectable information associated with the person or the object 105 , and may generate notifications based on such information.
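The trajectory determination described above can be sketched geometrically: extrapolate the object's straight-line path from its position and velocity, and test whether that path crosses the plane of the surface within the surface's extent. The 2-D geometry, coordinate choices, and names below are illustrative assumptions, not the claimed method:

```python
# Illustrative sketch: the surface lies on the vertical line x = surface_x,
# spanning y_min..y_max. The object's path is pos + t * vel for t >= 0.

def crosses_surface(pos, vel, surface_x=0.0, y_min=0.0, y_max=2.0):
    """Return True if the straight-line path hits the surface segment."""
    px, py = pos
    vx, vy = vel
    if vx == 0:
        return False          # moving parallel to the surface plane
    t = (surface_x - px) / vx
    if t < 0:
        return False          # the surface is behind the object
    y_hit = py + t * vy       # where the path meets the surface plane
    return y_min <= y_hit <= y_max
```

On a collision result, the system would generate the notification; on a non-collision result, it would reduce or remove any active notifications.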
  • FIGS. 1-6 illustrate specific example configurations of the various components of the system 100 .
  • the system 100 may include any configuration of the components, which may include using a greater or lesser number of the components.
  • the system 100 may include multiple servers 150 , multiple location determination devices 115 , and multiple image capture devices 120 .
  • the system 100 may be configured to not include the air movement sensor 125 , but may include the sound sensor 130 and the vibration sensor 135 .
  • Object 105 in FIGS. 1A-2C is shown as being a person that is approaching the interactive surface 110 .
  • the object 105 is not limited to being a person, but may also be any animal or thing that could potentially collide with the interactive surface 110 .
  • the object 105 may be anything that may be sensed by the various components of the system 100 .
  • the interactive surface 110 may be any device that a user or potentially an animal could interact with and may be configured to display a wide range of notifications to users. As shown in FIGS. 1A-2C , the interactive surface 110 may be an interactive glass display, a computer, a mobile device, a tablet device, a wall display, an interactive table display, any interactive monitor or television display, any touchscreen device having a display, a plasma display, a liquid crystal display, a 3-dimensional (3-D) display, or any other device that has a display. In one embodiment, the interactive surface 110 may be an electrochromic device, a suspended particle device, a polymer dispersed liquid crystal device, a mechanical smart window, or a device having the capability of changing its opacity, color, visual appearance, or temperature, or a combination thereof.
  • the interactive surface 110 may include or otherwise be communicatively linked to one or more electronic processors 112 that may be configured to perform a variety of operations in the system 100 .
  • the electronic processors 112 may perform the operations in the system 100 or may direct various components in communication with the interactive surface 110 or other components of the system 100 to perform the operations, or both.
  • the electronic processors 112 may direct the electronic processors 116 of the location determination device 115 to determine position and velocity information for the object 105 .
  • the electronic processors 112 may direct the image capture device 120 to capture media content associated with the object 105 .
  • the interactive surface 110 may be configured to display colors, adjust its opacity, display images and messages, display user-interactive programs, emit sounds of various pitches, volumes, and timbres, emit vibrations, adjust its temperature, receive input from users, display outputs, or perform any computer operations by utilizing the electronic processor 112 or other means. Additionally, the interactive surface 110 may be communicatively linked to one or more of the location determination device 115 , the image capture device 120 , the air movement sensor 125 , the sound sensor 130 , the vibration sensor 135 , the ultrasound device 137 , the mobile device 140 , the communications network 145 , the server 150 , and the database 155 .
  • the location determination device 115 may be any device that may be utilized to determine a position or a velocity, or both, of the object 105 , which, in FIGS. 1A-2C , is a person.
  • the location determination device 115 may include or otherwise be communicatively linked to one or more electronic processors 116 that may be configured to perform a variety of operations.
  • the electronic processors 116 may perform the operations in the system 100 or may direct various components in communication with the location determination device 115 or other components of the system 100 to perform the operations, or both.
  • the electronic processors 116 may be software, hardware, or a combination of hardware and software.
  • the location determination device 115 may include a global positioning system, a device that determines location by triangulation, or any device that can determine the position or velocity of the object 105 , or both. Notably, the location determination device 115 may be configured to determine a current position or a current velocity of the object 105 , or both, with respect to the position of the interactive surface 110 . Furthermore, the location determination device 115 may be communicatively linked with any of the components in the system 100 and can transmit data, such as the position or velocity data, or both together, with any other necessary information, to any of the components in the system 100 .
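One simple way such a device could derive the current velocity with respect to the surface, sketched here as a finite difference over two timestamped distance fixes. This is an assumption for illustration, not the method the disclosure specifies:

```python
# Illustrative sketch: approach speed from two timestamped distances to the
# interactive surface. A positive result means the object is closing in.

def velocity_from_fixes(d1: float, t1: float, d2: float, t2: float) -> float:
    """Finite-difference approach speed (m/s) from fixes (d1, t1) and (d2, t2)."""
    if t2 <= t1:
        raise ValueError("fixes must be in increasing time order")
    return (d1 - d2) / (t2 - t1)
```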
  • the location determination device 115 may be placed anywhere that the location determination device 115 can effectively capture information associated with the object 105 . In one embodiment, the location determination device 115 may be located in close proximity to or even within the interactive surface 110 itself.
  • the image capture device 120 of the system 100 may be a camera, such as, but not limited to, a video camera or other type of surveillance device that may be utilized to capture and record media content associated with the object 105 .
  • the media content can include visual content, audio content, content associated with vibrations that the object 105 makes, and other content.
  • the image capture device 120 can record video of the person and any sounds that the person makes when the person is in range of the interactive surface 110 or the system 100 .
  • the image capture device 120 may record sounds by utilizing a microphone, which may reside within the interactive surface 110 or in proximity to the interactive surface 110 .
  • the image capture device 120 may be configured to determine the position, size, and the velocity of the object 105 based on the media content that the image capture device 120 records. In another embodiment, the image capture device 120 may be able to generate a 3-D image that can be utilized to show where and how fast the object 105 is moving with respect to the interactive surface 110 . Furthermore, the image capture device 120 may be communicatively linked with any of the components in the system 100 . Although the image capture device 120 is illustratively shown above the interactive surface 110 , the image capture device 120 may be placed anywhere that the image capture device 120 can effectively capture media content associated with the object 105 . In an embodiment, the image capture device 120 may be a component of the interactive surface 110 itself, and may be positioned within the interactive surface 110 or outside of the interactive surface 110 .
  • the air movement sensor 125 of the system 100 may be configured to detect changes in air movement, which may be generated by the object 105 , that are in the general vicinity of the interactive surface 110 . For example, if a person is running towards the interactive surface 110 , the air movement sensor 125 can detect changes in air movement caused by the person's running. Additionally, the sound sensor 130 may be configured to detect changes in sound in the general area of the interactive surface 110 that may be caused by the object 105 . For example, if a person is making sounds near the interactive surface 110 , the sound sensor 130 can detect the sounds that the person makes. Furthermore, the vibration sensor 135 may be configured to detect changes in vibrations in the general area of the interactive surface 110 that may be caused by the object 105 . For example, the vibrations that a person makes while walking or running towards the interactive surface 110 may be detected by the vibration sensor 135 and may be used by the system 100 to determine where the person is and how fast the person is moving.
  • One or more of the air movement sensor 125 , the sound sensor 130 , and the vibration sensor 135 may be configured to transmit detected information to any of the components of the system 100 and may be configured to include electronic processors. Also, the information detected by the air movement sensor 125 , the sound sensor 130 , and the vibration sensor 135 may be utilized in determining the position and velocity of the object 105 . Although the air movement sensor 125 , the sound sensor 130 , and the vibration sensor 135 are illustratively shown on the interactive surface 110 , the sensors may be placed anywhere that the sensors can effectively detect the information about the object 105 .
  • the system 100 may also include an ultrasound device 137 , which can be configured to emit an ultrasonic sound in the direction of the object 105 . Once the ultrasonic sound is emitted, an echo may be returned from the object 105 based on the ultrasonic sound.
  • the echo may be utilized by the system 100 to determine various information about the object 105 such as, but not limited to, the size, shape, position, and velocity of the object 105 . Additionally, the echo may be utilized by the system 100 to generate an image, such as a 3-D image of the object 105 so that the interactive surface 110 can determine what is approaching it and display any necessary warning notifications.
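The echo-based ranging described above follows the standard pulse-echo relation: distance is half the round-trip echo time multiplied by the speed of sound, and velocity follows from the change in distance between successive pings. The constant and function names here are illustrative; 343 m/s is the speed of sound in air at roughly 20 °C:

```python
# Illustrative sketch of pulse-echo ranging for the ultrasound device.

SPEED_OF_SOUND = 343.0  # m/s in air at approximately 20 degrees Celsius

def distance_from_echo(round_trip_s: float) -> float:
    """Distance to the object in metres; the pulse travels out and back."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def approach_speed(d1: float, d2: float, dt: float) -> float:
    """Approach speed (m/s); positive when the object closes in between pings."""
    return (d1 - d2) / dt
```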
  • the mobile device 140 may be a cellular telephone, a tablet, a personal computer, or any other similar device that may be utilized by a person that is approaching the interactive surface 110 .
  • the mobile device 140 may be communicatively linked with any of the components in the system 100 and may transmit position or velocity data, or both, associated with the object 105 to the interactive surface 110 or other components in the system 100 .
  • position information, identification information, velocity information, or other types of information associated with the mobile device 140 or the person 105 , or both, may be transmitted to the system 100 by a global positioning system (GPS) of the mobile device 140 .
  • the mobile device 140 may utilize triangulation or other location determination mechanisms to determine the position and velocity of the object 105 .
  • the information transmitted to the system 100 may be utilized in determining a trajectory of the object 105 with respect to the interactive surface 110 .
  • the communications network 145 may be any suitable network that may be utilized to allow the various components of the system 100 to communicate with one another.
  • the communications network 145 may be a wireless network, an ethernet network, a satellite network, a broadband network, a cellular network, a private network, a cable network, the Internet, or any other network.
  • the server 150 may include an electronic processor and may be configured to handle any necessary processing for carrying out the various operative functions of the system 100 .
  • the server 150 may receive data from the interactive surface 110 , the location determination device 115 , the image capture device 120 , the air movement sensor 125 , the sound sensor 130 , the vibration sensor 135 , the mobile device 140 , the communications network 145 , the ultrasound device 137 , and the database 155 to determine the precise location of the object 105 .
  • any of the electronic processors disclosed herein may perform the operations in the system 100 or may direct various components in communication with the components of the system 100 to perform the operations, or both.
  • the server 150 may be configured to transmit signals to the interactive surface 110 so that the interactive surface 110 can display notifications.
  • the server 150 may send a signal to the interactive surface 110 to display a message on the interactive surface 110 that can warn the person of the impending collision. Furthermore, any and all data that traverses the system 100 may be stored in the database 155 .
  • the system 100 may be configured to provide awareness of the interactive surface 110 in a variety of ways.
  • an object 105 , in this case a person, is approaching the interactive surface 110 .
  • the interactive surface 110 is a large interactive glass display.
  • the person is illustratively shown at a location far away from the interactive surface 110 .
  • One or more of the location determination device 115 , the image capture device 120 , the various sensors 125 , 130 , and 135 , the ultrasound device 137 , the mobile device 140 , the communications network 145 , the server 150 , and the database 155 can all work in concert with one another to determine the person's position and velocity with respect to the interactive surface 110 . Notably, not all of these devices are required to determine the position and velocity of the person, but rather any subset of the devices in the system 100 may be utilized to determine the person's position and velocity.
  • the system 100 via server 150 or any other device in the system 100 , can determine a trajectory of the person with respect to the interactive surface 110 . If the trajectory indicates that a collision will occur, the system 100 may send a signal to the interactive surface 110 to display or emit a notification that can be perceived by the person.
  • the notifications may include, but are not limited to including, a displayed image or message, a change in color of the interactive surface 110 , emitted sounds and vibrations, or a change in the opacity of the interactive surface 110 .
  • the person is far enough away from the interactive surface 110 that the system 100 does not cause the interactive surface 110 to generate a notification and the interactive surface 110 can remain in a transparent state.
  • the person is shown at a location that is closer to the interactive surface 110 than the person is in FIG. 1A .
  • the various devices in the system 100 may determine that the person is located at a position and is moving at a velocity such that the person may eventually collide with the interface surface 110 . If the person is determined to be moving on a trajectory to eventually collide with the interface surface 110 based on his or her position or velocity, then system 100 may generate a notification and transmit the notification to the interface surface 110 .
  • The notification may be a message such as "Warning! Nearing Glass Surface!," or another message that can notify the person of the presence of the interactive surface 110.
  • The system 100 can cause the interactive surface 110 to emit sounds and vibrations to notify the person of the presence of the interactive surface 110 as well.
  • The person may either change his or her trajectory based on the notifications displayed or emitted by the interactive surface 110, or may continue to proceed on the trajectory towards the interactive surface 110. If the system 100 determines that the person stops moving, or moves on an alternate trajectory that would not cause the person to collide with the interactive surface 110, the system 100 may remove any displayed notifications or cause any emitted warning sounds and vibrations to stop.
  • However, if the person continues on the trajectory, the system 100 may transmit a signal to the interactive surface 110 to display a more intense message such as "Warning! You Are About To Collide With This Glass Surface!," or another message that can urgently notify the person of the presence of the interactive surface 110.
  • The system 100 can also cause the interactive surface 110 to emit sounds and vibrations of increased intensity to notify the person of the presence of the interactive surface 110. Ultimately, after sensing or seeing the notifications of increased intensity, the person can change his or her trajectory to avoid a collision with the interactive surface 110.
  • Once the collision has been avoided, the interactive surface 110 can either reduce the intensity of the notifications or remove the notifications altogether.
  • The interactive surface 110 can also display a program that the person may want to use or display instructions to teach the user how to use the interactive surface 110.
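The position-and-velocity trajectory test described above can be sketched as follows. This is a minimal illustration assuming a flat glass panel and straight-line motion in two dimensions; the `Surface` type, the five-second horizon, and all names are hypothetical and not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class Surface:
    x: float        # plane of the glass panel (x = constant)
    y_min: float    # lower edge of the panel
    y_max: float    # upper edge of the panel

def time_to_collision(pos, vel, surface, horizon_s=5.0):
    """Return seconds until the tracked object reaches the glass plane,
    or None if no collision is predicted within the horizon."""
    px, py = pos
    vx, vy = vel
    if vx == 0:
        return None                      # moving parallel to the glass
    t = (surface.x - px) / vx
    if t < 0 or t > horizon_s:
        return None                      # moving away, or too far out in time
    y_at_impact = py + vy * t
    if surface.y_min <= y_at_impact <= surface.y_max:
        return t                         # crossing point lies on the panel
    return None

panel = Surface(x=0.0, y_min=-1.0, y_max=1.0)
print(time_to_collision((-3.0, 0.0), (1.5, 0.0), panel))   # 2.0
print(time_to_collision((-3.0, 0.0), (-1.0, 0.0), panel))  # None
```

A positive result would trigger a notification, while None leaves the surface in its transparent state.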
  • An object 105, which is shown as being a person, is approaching the interactive surface 110, which in this case is a large interactive glass display.
  • the person is shown at a location far from the interactive surface 110 .
  • the various components of the system 100 can all work in concert together to determine the person's position and velocity with respect to the interactive surface 110 .
  • not all of the devices in the system 100 are required to determine the position and velocity of the person, but rather any subset of the devices in the system 100 may be utilized to determine the person's position and velocity.
  • The system 100, via the server 150 or any other device in the system 100, can determine the trajectory of the person with respect to the interactive surface 110. If the trajectory indicates that a collision will occur, the system 100 may send a signal to the interactive surface 110 to display or emit a notification that can be perceived by the person. In FIG. 2A, the person has been determined to be far enough away from the interactive surface 110 that the system 100 does not cause the interactive surface 110 to generate a notification. As a result, the interactive surface 110 remains in a transparent state.
  • the person is shown at a location that is closer to the interactive surface 110 than the person is in FIG. 2A and closer than a threshold distance.
  • The system 100 may determine that the person has a position and velocity such that the person may collide with the interactive surface 110. If the person is determined to be moving on a trajectory to eventually collide with the interactive surface 110 based on his or her position or velocity, then the system 100 may transmit a signal to the interactive surface 110 to adjust its opacity or color so that the interactive surface 110 can alert the person to its presence.
  • The interactive surface 110 is more opaque than the interactive surface 110 illustrated in FIG. 2A.
  • The system 100 can cause the interactive surface 110 to emit sounds or vibrations, or both, to notify the person of the location of the interactive surface 110.
  • The person may either change his or her trajectory or may continue to proceed on the trajectory towards the interactive surface 110. If the system 100 determines that the person is no longer on a trajectory to collide with the interactive surface 110, the system 100 may cause the interactive surface 110 to remove or reduce any notifications. However, if the person continues to move on the trajectory to collide with the interactive surface 110 as illustrated in FIG. 2C, then the system 100 may transmit a signal to the interactive surface 110 to increase the opacity of the interactive surface 110 to further notify the person of the presence of the interactive surface 110.
  • The interactive surface 110 may also be given a signal by the system 100 to display a darker or brighter color.
  • The interactive surface 110 is more opaque than the interactive surface 110 illustrated in FIG. 2B.
  • The system 100 can cause the interactive surface 110 to emit sounds or vibrations, or both, of increased intensity to notify the person of the interactive surface 110. Ideally, after noticing the notifications of increased intensity, the person will change his or her trajectory to avoid colliding with the interactive surface 110.
  • Once the collision has been avoided, the interactive surface 110 can either reduce the intensity of the notifications or remove the notifications.
  • The interactive surface 110 can also display a program that the person may want to use or even display instructions that can teach the user how to use the interactive surface 110.
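The graduated opacity behavior illustrated in FIGS. 2A-2C can be approximated by mapping the predicted time-to-collision onto an opacity level. This is a sketch under the assumption of a linear ramp; the function name and the four-second warning window are illustrative only.

```python
def opacity_for(ttc, max_ttc=4.0):
    """Map a predicted time-to-collision (seconds) to a panel opacity
    in [0.0, 1.0]; None means no collision is predicted."""
    if ttc is None:
        return 0.0                     # stay fully transparent
    ttc = max(0.0, min(ttc, max_ttc))  # clamp into the warning window
    return 1.0 - ttc / max_ttc         # closer in time -> more opaque

print(opacity_for(None))   # 0.0   -> surface stays clear (FIG. 2A)
print(opacity_for(2.0))    # 0.5   -> partially opaque (FIG. 2B)
print(opacity_for(0.5))    # 0.875 -> nearly opaque (FIG. 2C)
```

A real controller could re-evaluate this mapping on every sensor update, so the opacity falls back toward transparent as soon as the trajectory changes.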
  • An exemplary method 300 for providing awareness of an interactive surface 110 as shown in FIG. 3 involves a situation where a child is running around a family home.
  • the method 300 may include, at step 305 , having a child running around his family's house. While the child is running around the house, the system 100 may determine, at step 310 , whether the child is in the range of the interactive surface 110 , which in this case may be an interactive glass display. At step 315 , the system 100 may determine that the child is on a trajectory to collide with the interactive surface 110 . Once the system 100 determines that the child is on the trajectory to collide with the interactive surface 110 , the system 100 can transmit a signal to the interactive surface 110 to cause the interactive surface 110 to increase its opacity.
  • The system 100 can send a signal to the interactive surface 110 to generate and emit a low-pitched tone at step 325.
  • the child can see or hear, or both, the interactive surface 110 based on the increased opacity and the low-pitch tone.
  • the system 100 can determine that the child is on a new trajectory that would not cause a collision with the interactive surface 110 .
  • The system 100 can send signals to the interactive surface 110 to become transparent once again and to stop emitting the tone.
  • Another exemplary method 400 for providing awareness of an interactive surface 110 involves a situation where a child does not heed the first warning from the interactive surface 110.
  • The method 400 may include, at step 405, having the child running around his family's house as in the previous example. While the child is running around the house, the system 100 may determine, at step 410, the child's movement information, such as his position, velocity, and trajectory. At step 415, the system 100 may determine that the child is getting too close to the interactive surface 110. As a result, the method 400 may include having the system 100 send a signal to the interactive surface 110 to increase its opacity such that the interactive surface 110 is 20% white at step 420. At step 425, the system 100 can cause the interactive surface 110 to emit a 700 hertz (Hz) vibration.
  • The system 100 can determine that, despite the initial notifications, the child is getting even closer to the interactive surface 110. Now, the system 100 can send another signal to the interactive surface 110 to increase its opacity such that the interactive surface 110 is now 40% white at step 435. Additionally, at step 440, the system 100 can cause the interactive surface 110 to adjust the 700 Hz vibration to a 1400 Hz vibration. At this point, the child may see or hear, or both, the interactive surface 110 at step 445. At step 450, the system 100 can determine that a collision with the interactive surface 110 has been avoided because the child has changed his trajectory.
  • the child can run elsewhere and the system 100 can determine the child's new position, velocity, and trajectory and find that the child is no longer at risk of a collision.
  • the system 100 can transmit a signal to the interactive surface 110 to return to a clear or transparent state.
  • the system 100 can transmit a signal to the interactive surface 110 to stop emitting vibrations at step 465 .
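The two-stage escalation in method 400 (20% white with a 700 Hz vibration, then 40% white with a 1400 Hz vibration, then back to clear) can be sketched as a small controller. The class and the level table are hypothetical helpers, not part of the disclosure.

```python
# Escalation levels: (percent white, vibration frequency in Hz)
LEVELS = [(0, None), (20, 700), (40, 1400)]

class WarningController:
    """Steps the warning level up while the tracked child remains on a
    collision trajectory, and resets to clear once the path is safe."""
    def __init__(self):
        self.level = 0

    def update(self, on_collision_course):
        if on_collision_course:
            self.level = min(self.level + 1, len(LEVELS) - 1)
        else:
            self.level = 0             # collision avoided: return to clear
        return LEVELS[self.level]

ctrl = WarningController()
print(ctrl.update(True))    # (20, 700)   first warning (steps 420/425)
print(ctrl.update(True))    # (40, 1400)  child keeps approaching (steps 435/440)
print(ctrl.update(False))   # (0, None)   trajectory changed (steps 460/465)
```

Capping the level at the last table entry means repeated collision-course updates hold the strongest warning rather than escalating without bound.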
  • the method 500 may include, at step 502 , recording media content associated with an object 105 that is in the range of an interactive surface 110 .
  • the media content can be sound content, visual content, or vibrations, any combination thereof, or any other content.
  • the media content may be recorded, for example, by the image capture device 120 or other appropriate device.
  • the method 500 may include determining a position and a velocity of the object 105 based on the media content or on any of the information gathered by any of the components in the system 100 , or any combination thereof.
  • the position and the velocity may be determined by the location determination device 115 , the interactive surface 110 , the server 150 , the computer system 600 , any combination thereof, or other appropriate device.
  • The method 500 may include, at step 506, determining if the object 105 has a trajectory that would cause the object 105 to collide with the interactive surface 110 based on the position and the velocity of the object 105.
  • the trajectory may also be determined based on sensed vibrations, sounds, images, or other information gathered by the components of the system 100 .
  • the trajectory may be determined by the location determination device 115 , the interactive surface 110 , the server 150 , the computer system 600 , any combination thereof, or other appropriate device.
  • The method 500 may include generating a notification on or near the interactive surface 110 to indicate the location of the interactive surface 110 at step 510, via the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device.
  • the method 500 may include determining if the object 105 is still on the trajectory to collide with the interactive surface 110 despite the notifications.
  • Step 512 may be performed by the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device. If the object 105 is no longer on course to collide with the interactive surface 110 after the initial notifications, the method 500 may include reducing or removing the notifications from or around the interactive surface 110 at step 514. However, if the object 105 is still on course for a collision with the interactive surface 110, then the method 500 may include increasing the intensity or number of notifications, or both, at step 516, so that the object 105 is prompted to change its trajectory. In one embodiment, steps 514 and 516 may be performed by the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device.
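Steps 502-516 above can be combined into a single sense-predict-notify loop. The sketch below reduces the problem to the one axis normal to the glass and uses an illustrative three-second horizon; all names and thresholds are assumptions.

```python
def run_awareness(samples, plane_x=0.0, horizon_s=3.0):
    """Process (position, velocity) samples measured along the axis
    normal to the glass and return the notification for each step:
    the trajectory test (step 506) drives notify (step 510),
    intensify (step 516), and reduce/remove (step 514)."""
    level, log = 0, []
    for px, vx in samples:
        # Predict time-to-collision; None if moving away or already past.
        ttc = (plane_x - px) / vx if vx > 0 and px < plane_x else None
        if ttc is not None and ttc <= horizon_s:
            level += 1                  # step 516: intensify the warning
            log.append(f"warn level {level}")
        else:
            level = 0                   # step 514: reduce or remove
            log.append("clear")
    return log

print(run_awareness([(-5.0, 0.5), (-2.0, 1.0), (-1.0, 1.0), (-1.0, -0.5)]))
# ['clear', 'warn level 1', 'warn level 2', 'clear']
```

In the sample trace, the object is first too far out in time to warn, then triggers two escalating warnings, and finally reverses direction, which clears the notifications.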
  • The system 100 and methods described herein may include presenting the notifications on only a portion of the interactive surface 110. For instance, while notifications are presented on some portions, other portions of the interactive surface 110 can remain in normal operational mode.
  • the systems and methods may include enabling a person to use the interactive surface 110 as a whiteboard. For example, if the system 100 causes the interactive surface 110 to become opaque as a notification to a person approaching the interactive surface 110 , the opaque portion may be used as a whiteboard by the person.
  • the trajectory calculated by the system 100 may be a predicted path that the person or object 105 may be determined to be following.
  • the system 100 and methods may include providing a specific sequence or pattern of notifications based on the situation that presents itself.
  • the system 100 may cause the interactive surface 110 to emit a specific low sound, multi-pulse sequence that can indicate when the interactive surface 110 can be utilized as a computer display, whiteboard, or for certain other purposes.
  • the system 100 and methods may include having the interactive surface 110 visually or orally identify its functions to users and any data to which it has access.
  • The system 100 and methods can utilize infrared technology, motion detectors, laser detectors, or any other type of detector to assist in determining a position, velocity, or other information associated with an object 105.
  • the method 500 may include determining the positions and velocities of multiple objects 105 .
  • Trajectories may be determined either for a single object 105 or for multiple objects 105, and notifications may be generated so that they are tailored to each object 105 that is approaching the interactive surface 110. It is important to note that the methods described above may incorporate any of the functionality, devices, or features of the systems described above, or otherwise, and are not intended to be limited to the description or examples provided herein.
  • the methodologies and techniques described with respect to the exemplary embodiments can incorporate a machine, such as, but not limited to, computer system 600 , or other computing device within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies or functions discussed above.
  • the machine may be configured to facilitate various operations conducted by the system 100 .
  • the machine may be configured to, but is not limited to, assist the system 100 by providing processing power to assist with processing loads experienced in the system 100 , by providing storage capacity for storing instructions or data traversing the system 100 , by capturing media content, or by assisting with any other operations conducted by or within the system 100 .
  • the machine operates as a standalone device.
  • the machine may be connected (e.g., using a network 145 ) to and assist with operations performed by other machines, such as, but not limited to, the interactive surface 110 , the location determination device 115 , the image capture device 120 , the air movement sensor 125 , the sound sensor 130 , the vibration sensor 135 , the ultrasound device 137 , the mobile device 140 , the server 150 , and the database 155 , or any combination thereof.
  • the machine may be connected with any component in the system 100 .
  • the machine may operate in the capacity of a server or a client user machine in server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The computer system 600 may include a processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 604 and a static memory 606, which communicate with each other via a bus 608.
  • the computer system 600 may further include a video display unit 610 (e.g., a liquid crystal display (LCD), a flat panel, a solid state display, or a cathode ray tube (CRT)).
  • the computer system 600 may include an input device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a mouse), a disk drive unit 616 , a signal generation device 618 (e.g., a speaker or remote control) and a network interface device 620 .
  • the disk drive unit 616 may include a machine-readable medium 622 on which is stored one or more sets of instructions 624 (e.g., software) embodying any one or more of the methodologies or functions described herein, including those methods illustrated above.
  • the instructions 624 may also reside, completely or at least partially, within the main memory 604 , the static memory 606 , or within the processor 602 , or a combination thereof, during execution thereof by the computer system 600 .
  • the main memory 604 and the processor 602 also may constitute machine-readable media.
  • Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices can likewise be constructed to implement the methods described herein.
  • Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
  • the example system is applicable to software, firmware, and hardware implementations.
  • the methods described herein are intended for operation as software programs running on a computer processor.
  • Software implementations, including but not limited to distributed processing, component/object distributed processing, parallel processing, or virtual machine processing, can also be constructed to implement the methods described herein.
  • the present disclosure contemplates a machine readable medium containing instructions 624 so that a device connected to the communications network 145 can send or receive voice, video or data, and to communicate over the network 145 using the instructions 624 .
  • the instructions 624 may further be transmitted or received over the network 145 via the network interface device 620 .
  • machine-readable medium 622 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term "machine-readable medium" shall accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical media such as a disk or tape; or any other self-contained information archive or set of archives, which is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.

Abstract

A system for providing awareness of an interactive surface is disclosed. The system may include a processor that is communicatively linked to an interactive surface. The processor may determine a position and a velocity of an object that is within range of the interactive surface based on one or more of media content, vibrations, air movement, sounds, and global positioning data associated with the object. Additionally, the processor may determine if the object has a trajectory that would cause the object to collide with the interactive surface based on the information associated with the object. If the processor determines that the object has a trajectory that would cause the object to collide with the interactive surface, the processor can generate a notification.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application claims priority to and is a continuation of U.S. patent application Ser. No. 15/040,045, filed on Feb. 10, 2016, which is a continuation of U.S. patent application Ser. No. 13/633,579, filed on Oct. 2, 2012, now U.S. Pat. No. 9,292,136, both of which are herein incorporated by reference in their entireties.
  • FIELD OF THE INVENTION
  • The present application relates to collision warning systems, and more particularly to a system and method for preventing a collision with an interactive surface.
  • BACKGROUND
  • In today's technologically-driven society, interactive glass surfaces and displays are at the forefront of innovation and, while not yet ubiquitous, will become increasingly popular with users in the coming years. This is particularly true considering the very rapid commercial acceptance of products that utilize touchscreen technologies such as touchscreen phones, tablets, computers, televisions, and displays. As a result, users will have many more opportunities to interact with such interactive glass surfaces or other similar surfaces. For example, network-connected computers can display computer interfaces on a glass table for users sitting at the table during a meeting; glass walls can have interfaces that users can interact with to perform a variety of functions; sliding glass doors in homes can have computer displays that users can interact with, and patient bedrooms in hospitals can have glass walls that doctors can use to view medical information.
  • As interactive glass surfaces become more widespread, the chances of users or objects colliding with such glass surfaces will increase substantially. Unfortunately, glass surfaces can be quite dangerous to those who do not see them. A user that collides with an interactive glass surface can incur significant injuries or even cause the glass to shatter or malfunction. For example, if an interactive glass surface is used in a hospital, two doctors discussing and walking at the same time may collide with a clear interactive glass display. Also, kids running around a household may accidentally run into interactive glass doors or other displays and sustain serious injuries. Additionally, users may accidentally place inappropriate materials on a computer display that may ultimately damage the display. Furthermore, when a person is walking in the dark in the middle of the night to get a glass of water, the person may walk right into an interactive glass display. Although the future of interactive displays is exciting, there are many unique nuances associated with the use of interactive displays.
  • SUMMARY
  • A system and accompanying methods for providing awareness of an interactive surface are disclosed. The system may be configured to generate various types of notifications or perform other actions if a person or an object has a trajectory that indicates that the person or object will collide with an interactive surface. For example, various types of notifications may include, but are not limited to, one or more of adjusting the opacity of the interactive surface, presenting different types of messages on the glass surface, emitting sounds and vibrations, and displaying various types of images or colors on the glass surface. Notably, the trajectory of the person or object may be determined based on determining position and velocity by using one or more cameras, sensors, location positioning systems, and other systems and devices that can capture and process information about the person or the object. If the person or object continues on a trajectory to collide with the interactive surface, the notifications may increase in intensity. On the other hand, if the person or object changes its trajectory such that the person or object is no longer on a path to collide with the interactive surface, the system may cause the notifications to decrease in intensity or remove the notifications altogether.
  • The system for preventing a collision with an interactive surface may include a memory for storing instructions and a processor that is communicatively coupled to the interactive surface. The processor may execute the instructions to perform operations comprising receiving media content associated with an object that is within a range of the interactive surface. Additionally, the processor may be configured to detect a position and a velocity of the object that is in the range of the interactive surface. The processor may detect the position and the velocity based at least in part on the recorded media content. Also, the processor may determine if the object has a trajectory that would cause the object to collide with the interactive surface. The trajectory may be determined based on the position and the velocity of the object. Furthermore, the processor may generate a notification if the object is determined to have a trajectory that would cause the object to collide with the interactive surface.
  • In another embodiment, a method for providing awareness of an interactive surface is provided. The method may include receiving media content associated with an object that is within a range of the interactive surface. Additionally, the method may include determining a position and a velocity of the object that is in the range of the interactive surface based at least in part on the media content. A processor may be configured to determine the position and the velocity of the object by executing instructions stored in a memory. The method may also include determining if the object has a trajectory that would cause the object to collide with the interactive surface. The trajectory of the object may be determined based on the position and the velocity of the object. Furthermore, the method may include generating a notification if the object is determined to have the trajectory that would cause the object to collide with the interactive surface.
  • According to another exemplary embodiment, a tangible computer-readable medium comprising instructions for providing awareness of an interactive surface may be provided. The computer instructions, when loaded and executed by a processor, may cause the processor to perform operations including the following: receiving media content associated with an object that is within a range of an interactive surface; determining a position and a velocity of the object that is in the range of the interactive surface based at least in part on the media content; determining if the object has a trajectory that would cause the object to collide with the interactive surface, wherein the trajectory is determined based on the position and the velocity of the object; and generating a notification if the object is determined to have the trajectory that would cause the object to collide with the interactive surface.
  • These and other features of the system and methods are described in the following detailed description, drawings, and appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a schematic diagram illustrating a person that is approaching an interactive surface of an awareness system according to the present disclosure.
  • FIG. 1B is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered various notifications from the interactive surface of FIG. 1A.
  • FIG. 1C is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered more intense notifications from the interactive surface of FIG. 1A.
  • FIG. 2A is a schematic diagram illustrating a person that is approaching an interactive surface of another embodiment of the awareness system.
  • FIG. 2B is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered a notification from the interactive surface of FIG. 2A.
  • FIG. 2C is a schematic diagram illustrating a person that is in a position and is moving at a velocity that has triggered more intense notifications from the interactive surface of FIG. 2A.
  • FIG. 3 is a flow diagram that illustrates a sample method for receiving various types of notifications based on the person's trajectory with respect to an interactive surface according to the present disclosure.
  • FIG. 4 is a flow diagram illustrating another sample method for receiving various types of notifications based on a person's trajectory with respect to an interactive surface.
  • FIG. 5 is a flow diagram illustrating another method for providing awareness of an interactive surface according to an embodiment of the present disclosure.
  • FIG. 6 is a diagrammatic representation of a machine in the form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed herein.
  • DETAILED DESCRIPTION OF THE INVENTION
  • A system 100 for providing awareness of an interactive surface 110 is disclosed in the present disclosure. The system 100 alerts a person to the presence of an interactive surface 110 to prevent a collision with the interactive surface 110. The system 100 for providing awareness of an interactive surface 110 may be configured to generate various types of notifications or perform other actions if a user or an object 105 has a trajectory that indicates that the user or object will collide with an interactive surface 110 or other surface. For example, various types of notifications may include, but are not limited to including, one or more of adjusting the opacity of the interactive surface 110, presenting different types of messages on or near the interactive surface 110, emitting sounds and vibrations from or around the interactive surface 110, or displaying various types of images or colors on or near the interactive surface 110.
  • The system 100 may determine the trajectory of the person or object 105 based on determining the person or object's position and velocity by using one or more image capture devices 120, sensors, location positioning systems, and other systems and devices that can retrieve or process information, or both, relating to the person or the object 105. If the person or object 105 continues on the trajectory to collide with the interactive surface 110, the system 100 may increase the intensity of the notifications or actions. However, if the person or object 105 adjusts its trajectory so that the person or object is no longer on a path to collide with the interactive surface 110, the system 100 may decrease the intensity of the notifications or even remove the notifications altogether. Also, the system 100 may utilize sensors to determine one or more of air movement, sounds, vibrations or other detectable information associated with the person or the object 105, and may generate notifications based on such information.
  • Although FIGS. 1-6 illustrate specific example configurations of the various components of the system 100, the system 100 may include any configuration of the components, which may include using a greater or lesser number of the components. For example, the system 100 may include multiple servers 150, multiple location determination devices 115, and multiple image capture devices 120. As another example, the system 100 may be configured to not include the air movement sensor 125, but may include the sound sensor 130 and the vibration sensor 135.
  • Object 105 in FIGS. 1A-2C is shown as being a person that is approaching the interactive surface 110. However, the object 105 is not limited to being a person, but may also be any animal or thing that could potentially collide with the interactive surface 110. Additionally, the object 105 may be anything that may be sensed by the various components of the system 100.
  • The interactive surface 110 may be any device that a user or potentially an animal could interact with and may be configured to display a wide range of notifications to users. As shown in FIGS. 1A-2C, the interactive surface 110 may be an interactive glass display, a computer, a mobile device, a tablet device, a wall display, an interactive table display, any interactive monitor or television display, any touchscreen device having a display, a plasma display, a liquid crystal display, a 3-dimensional (3-D) display, or any other device that has a display. In one embodiment, the interactive surface 110 may be an electrochromic device, a suspended particle device, a polymer dispersed liquid crystal device, a mechanical smart window, or a device having the capability of changing its opacity, color, visual appearance, or temperature, or a combination thereof.
  • The interactive surface 110 may include or otherwise be communicatively linked to one or more electronic processors 112 that may be configured to perform a variety of operations in the system 100. The electronic processors 112 may perform the operations in the system 100 or may direct various components in communication with the interactive surface 110 or other components of the system 100 to perform the operations, or both. For example, the electronic processors 112 may direct the electronic processors 116 of the location determination device 115 to determine position and velocity information for the object 105. As another example, the electronic processors 112 may direct the image capture device 120 to capture media content associated with the object 105. Notably, the interactive surface 110 may be configured to display colors, adjust its opacity, display images and messages, display user-interactive programs, emit sounds of various pitches, volumes, and timbres, emit vibrations, adjust its temperature, receive input from users, display outputs, or perform any computer operations by utilizing the electronic processor 112 or other means. Additionally, the interactive surface 110 may be communicatively linked to one or more of the location determination device 115, the image capture device 120, the air movement sensor 125, the sound sensor 130, the vibration sensor 135, the ultrasound device 137, the mobile device 140, the communications network 145, the server 150, and the database 155.
  • The location determination device 115 may be any device that may be utilized to determine a position or a velocity, or both, of the object 105, which, in FIGS. 1-6, is a person. The location determination device 115 may include or otherwise be communicatively linked to one or more electronic processors 116 that may be configured to perform a variety of operations. The electronic processors 116 may perform the operations in the system 100 or may direct various components in communication with the location determination device 115 or other components of the system 100 to perform the operations, or both. The electronic processors 116 may be software, hardware, or a combination of hardware and software. The location determination device 115 may include a global positioning system, a device that determines location by triangulation, or any device that can determine the position or velocity of the object 105, or both. Notably, the location determination device 115 may be configured to determine a current position or a current velocity of the object 105, or both, with respect to the position of the interactive surface 110. Furthermore, the location determination device 115 may be communicatively linked with any of the components in the system 100 and can transmit data, such as the position or velocity data, or both, together with any other necessary information, to any of the components in the system 100. Although the location determination device 115 is illustratively shown on the interactive surface 110, the location determination device 115 may be placed anywhere that the location determination device 115 can effectively capture information associated with the object 105. In one embodiment, the location determination device 115 may be located in close proximity to or even within the interactive surface 110 itself.
  • The image capture device 120 of the system 100 may be a camera, such as, but not limited to, a video camera or other type of surveillance device that may be utilized to capture and record media content associated with the object 105. The media content can include visual content, audio content, content associated with vibrations that the object 105 makes, and other content. For example, if the object 105 is a person, as shown in FIGS. 1A-2C, the image capture device 120 can record video of the person and any sounds that the person makes when the person is in range of the interactive surface 110 or the system 100. The image capture device 120 may record sounds by utilizing a microphone, which may reside within the interactive surface 110 or in proximity to the interactive surface 110.
  • In one embodiment, the image capture device 120 may be configured to determine the position, size, and the velocity of the object 105 based on the media content that the image capture device 120 records. In another embodiment, the image capture device 120 may be able to generate a 3-D image that can be utilized to show where and how fast the object 105 is moving with respect to the interactive surface 110. Furthermore, the image capture device 120 may be communicatively linked with any of the components in the system 100. Although the image capture device 120 is illustratively shown above the interactive surface 110, the image capture device 120 may be placed anywhere that the image capture device 120 can effectively capture media content associated with the object 105. In an embodiment, the image capture device 120 may be a component of the interactive surface 110 itself, and may be positioned within the interactive surface 110 or outside of the interactive surface 110.
  • The air movement sensor 125 of the system 100 may be configured to detect changes in air movement, which may be generated by the object 105, that are in the general vicinity of the interactive surface 110. For example, if a person is running towards the interactive surface 110, the air movement sensor 125 can detect changes in air movement caused by the person's running. Additionally, the sound sensor 130 may be configured to detect changes in sound in the general area of the interactive surface 110 that may be caused by the object 105. For example, if a person is making sounds near the interactive surface 110, the sound sensor 130 can detect the sounds that the person makes. Furthermore, the vibration sensor 135 may be configured to detect changes in vibrations in the general area of the interactive surface 110 that may be caused by the object 105. For example, the vibrations that a person makes while walking or running towards the interactive surface 110 may be detected by the vibration sensor 135 and may be used by the system 100 to determine where the person is and how fast the person is moving.
  • One or more of the air movement sensor 125, the sound sensor 130, and the vibration sensor 135 may be configured to transmit detected information to any of the components of the system 100 and may be configured to include electronic processors. Also, the information detected by the air movement sensor 125, the sound sensor 130, and the vibration sensor 135 may be utilized in determining the position and velocity of the object 105. Although the air movement sensor 125, the sound sensor 130, and the vibration sensor 135 are illustratively shown on the interactive surface 110, the sensors may be placed anywhere that the sensors can effectively detect the information about the object 105.
  • In one embodiment, the system 100 may also include an ultrasound device 137, which can be configured to emit an ultrasonic sound in the direction of the object 105. Once the ultrasonic sound is emitted, an echo may be returned from the object 105 based on the ultrasonic sound. The echo may be utilized by the system 100 to determine various information about the object 105 such as, but not limited to, the size, shape, position, and velocity of the object 105. Additionally, the echo may be utilized by the system 100 to generate an image, such as a 3-D image of the object 105 so that the interactive surface 110 can determine what is approaching it and display any necessary warning notifications.
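The distance half of the echo computation described above follows directly from time-of-flight. The sketch below is an assumption about how a device like the ultrasound device 137 might convert an echo's round-trip time into range; the function name and the 343 m/s speed-of-sound figure (dry air, roughly 20 °C) are not from the patent.

```python
# Hypothetical time-of-flight range calculation for an ultrasound device.
SPEED_OF_SOUND_M_S = 343.0  # assumed: dry air at ~20 degrees Celsius

def echo_distance(round_trip_s):
    """Distance to the object; the echo travels out and back, so halve it."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

# An echo returning after 10 ms puts the object about 1.7 m away.
print(round(echo_distance(0.010), 3))
```

Repeating the measurement over time would also yield the object's approach velocity, which is the other quantity the patent says the echo can supply.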
  • The mobile device 140 may be a cellular telephone, a tablet, a personal computer, or any other similar device that may be utilized by a person that is approaching the interactive surface 110. Notably, the mobile device 140 may be communicatively linked with any of the components in the system 100 and may transmit position or velocity data, or both, associated with the object 105 to the interactive surface 110 or other components in the system 100. For example, if the mobile device 140 is a cellular phone with a global positioning system (GPS), then position information, identification information, velocity information, or other types of information associated with the mobile device 140 or the person 105, or both, may be transmitted to the system 100 by the GPS. Additionally, the mobile device 140 may utilize triangulation or other location determination mechanisms to determine the position and velocity of the object 105. The information transmitted to the system 100 may be utilized in determining a trajectory of the object 105 with respect to the interactive surface 110.
  • The communications network 145 may be any suitable network that may be utilized to allow the various components of the system 100 to communicate with one another. For instance, the communications network 145 may be a wireless network, an ethernet network, a satellite network, a broadband network, a cellular network, a private network, a cable network, the Internet, or any other network. The server 150 may include an electronic processor and may be configured to handle any necessary processing for carrying out the various operative functions of the system 100. For example, the server 150 may receive data from the interactive surface 110, the location determination device 115, the image capture device 120, the air movement sensor 125, the sound sensor 130, the vibration sensor 135, the mobile device 140, the communications network 145, the ultrasound device 137, and the database 155 to determine the precise location of the object 105. In one embodiment, any of the electronic processors disclosed herein may perform the operations in the system 100 or may direct various components in communication with the components of the system 100 to perform the operations, or both. Additionally, the server 150 may be configured to transmit signals to the interactive surface 110 so that the interactive surface 110 can display notifications. As an example, if the system 100 determines that a person is about to collide with the interactive surface 110, the server 150 may send a signal to the interactive surface 110 to display a message on the interactive surface 110 that can warn the person of the impending collision. Furthermore, any and all data that traverses the system 100 may be stored in the database 155.
  • Operatively, the system 100 may be configured to provide awareness of the interactive surface 110 in a variety of ways. In a first example scenario, which is illustrated in FIGS. 1A-1C, an object 105 (in this case a person) is approaching the interactive surface 110. In this example, the interactive surface 110 is a large interactive glass display. In FIG. 1A, for example, the person is illustratively shown at a location far away from the interactive surface 110. One or more of the location determination device 115, the image capture device 120, the various sensors 125, 130, and 135, the ultrasound device 137, the mobile device 140, the communications network 145, the server 150, and the database 155, can all work in concert with one another to determine the person's position and velocity with respect to the interactive surface 110. Notably, not all of these devices are required to determine the position and velocity of the person, but rather any subset of the devices in the system 100 may be utilized to determine the person's position and velocity.
  • Once the position and velocity of the person are determined, the system 100, via server 150 or any other device in the system 100, can determine a trajectory of the person with respect to the interactive surface 110. If the trajectory indicates that a collision will occur, the system 100 may send a signal to the interactive surface 110 to display or emit a notification that can be perceived by the person. The notifications may include, but are not limited to including, a displayed image or message, a change in color of the interactive surface 110, emitted sounds and vibrations, or a change in the opacity of the interactive surface 110. In FIG. 1A, the person is far enough away from the interactive surface 110 that the system 100 does not cause the interactive surface 110 to generate a notification and the interactive surface 110 can remain in a transparent state.
  • In FIG. 1B, however, the person is shown at a location that is closer to the interactive surface 110 than the person is in FIG. 1A. The various devices in the system 100 may determine that the person is located at a position and is moving at a velocity such that the person may eventually collide with the interactive surface 110. If the person is determined to be moving on a trajectory to eventually collide with the interactive surface 110 based on his or her position or velocity, then the system 100 may generate a notification and transmit the notification to the interactive surface 110. In one embodiment, the notification may be a message such as “Warning! Nearing Glass Surface!,” or another message that can notify the person of the presence of the interactive surface 110. Additionally, the system 100 can cause the interactive surface 110 to emit sounds and vibrations to notify the person of the presence of the interactive surface 110 as well. At this point, the person may either change his or her trajectory based on the notifications displayed or emitted by the interactive surface 110, or may continue to proceed on the trajectory towards the interactive surface 110. If the system 100 determines that the person stops moving, or moves in an alternate trajectory that would not cause the person to collide with the interactive surface 110, the system 100 may remove any displayed notifications or cause any emitted warning sounds and vibrations to stop.
  • However, if the person continues to move on the trajectory to collide with the interactive surface 110 as illustrated in FIG. 1C, then the system 100 may transmit a signal to the interactive surface 110 to display a more intense message such as “Warning! You Are About To Collide With This Glass Surface!,” or another message that can urgently notify the person of the presence of the interactive surface 110. Additionally, as shown in FIG. 1C, the system 100 can cause the interactive surface 110 to emit sounds and vibrations of increased intensity to notify the person of the presence of the interactive surface 110. Hopefully, after sensing or seeing the notifications of increased intensity, the person will change his or her trajectory to avoid a collision with the interactive surface 110. If, however, the person was merely rushing to use the interactive surface 110, and the person eventually slows down to a point where the system 100 no longer considers the person to be on a trajectory to collide with the interactive surface 110, the interactive surface 110 can either reduce the intensity of the notifications or remove the notifications altogether. In one embodiment, the interactive surface 110 can display a program that the person may want to use or display instructions to teach the user how to use the interactive surface 110.
  • In another example scenario, which is illustrated in FIGS. 2A-2C, an object 105, which is shown as being a person, is approaching the interactive surface 110, which in this case is a large interactive glass display. In FIG. 2A, the person is shown at a location far from the interactive surface 110. As in the previous example, the various components of the system 100 can all work in concert together to determine the person's position and velocity with respect to the interactive surface 110. However, as mentioned above, not all of the devices in the system 100 are required to determine the position and velocity of the person, but rather any subset of the devices in the system 100 may be utilized to determine the person's position and velocity. Once the position and velocity of the person are determined, the system 100, via server 150 or any other device in the system 100, can determine the trajectory of the person with respect to the interactive surface 110. If the trajectory indicates that a collision will occur, the system 100 may send a signal to the interactive surface 110 to display or emit a notification that can be perceived by the person. In FIG. 2A, the person has been determined to be far enough away from the interactive surface 110 that the system 100 does not cause the interactive surface 110 to generate a notification. As a result, the interactive surface 110 remains in a transparent state.
  • In FIG. 2B, however, the person is shown at a location that is closer to the interactive surface 110 than the person is in FIG. 2A and closer than a threshold distance. When the person proceeds closer than the threshold distance, the system 100 may determine that the person has a position and velocity such that the person may collide with the interactive surface 110. If the person is determined to be moving on a trajectory to eventually collide with the interactive surface 110 based on his or her position or velocity, then the system 100 may transmit a signal to the interactive surface 110 to adjust its opacity or color so that the system 100 can alert the person of the presence of the interactive surface 110. As shown in FIG. 2B, the interactive surface 110 is more opaque than the interactive surface 110 illustrated in FIG. 2A. Alternatively or additionally, the system 100 can cause the interactive surface 110 to emit sounds or vibrations, or both, to notify the person of the location of the interactive surface 110. Upon sensing the notifications from the system 100, the person may either change his or her trajectory or may continue to proceed on the trajectory towards the interactive surface 110. If the system 100 determines that the person is no longer on a trajectory to collide with the interactive surface 110, the system 100 may cause the interactive surface 110 to remove or reduce any notifications. However, if the person continues to move on the trajectory to collide with the interactive surface 110 as illustrated in FIG. 2C, then the system 100 may transmit a signal to the interactive surface 110 to increase the opacity of the interactive surface 110 to further notify the person of the presence of the interactive surface 110.
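The threshold-distance behavior in the FIGS. 2A-2C scenario can be illustrated with a simple distance-to-opacity mapping. This is a sketch under stated assumptions, not the patent's implementation: the function name, the 4 m threshold, the 1 m minimum, and the linear ramp are all hypothetical values chosen for illustration.

```python
# Hypothetical mapping from an approaching person's distance to a glass
# opacity level: transparent beyond a threshold (FIG. 2A), increasingly
# opaque as the trajectory continues toward the surface (FIGS. 2B-2C).

def opacity_for_distance(distance_m, threshold_m=4.0, min_distance_m=1.0):
    """Return opacity 0.0 (fully transparent) .. 1.0 (fully opaque)."""
    if distance_m >= threshold_m:
        return 0.0  # far away: remain in a transparent state
    if distance_m <= min_distance_m:
        return 1.0  # about to collide: fully opaque
    # Linear ramp between the threshold and the minimum distance.
    return (threshold_m - distance_m) / (threshold_m - min_distance_m)

print(opacity_for_distance(5.0))   # beyond threshold -> 0.0
print(opacity_for_distance(2.5))   # partway in -> 0.5
print(opacity_for_distance(0.5))   # too close -> 1.0
```

An electrochromic or suspended-particle panel of the kind the patent mentions would take such a value (suitably quantized) as its drive level; the monotonic ramp also gives the reversible reduce-or-remove behavior when the person backs away.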
  • In one embodiment, the interactive surface 110 may be given a signal by the system 100 to display a darker or brighter color on the interactive surface 110. For instance, as shown in FIG. 2C, the interactive surface 110 is more opaque than the interactive surface 110 illustrated in FIG. 2B. Additionally, the system 100 can cause the interactive surface 110 to emit sounds or vibrations, or both, of increased intensity to notify the person of the interactive surface 110. Hopefully, after noticing the notifications of increased intensity, the person will change his or her trajectory to avoid colliding with the interactive surface 110. However, if the person was merely rushing to use the interactive surface 110, and the person eventually slows down to a point where the system 100 determines that the person is not on a trajectory to collide with the interactive surface 110, the interactive surface 110 can either reduce the intensity of the notifications or remove the notifications. In one embodiment, the interactive surface 110 can display a program that the person may want to use or even display instructions that can teach the user how to use the interactive surface 110.
  • An exemplary method 300 for providing awareness of an interactive surface 110 as shown in FIG. 3 involves a situation where a child is running around a family home. The method 300 may include, at step 305, having a child running around his family's house. While the child is running around the house, the system 100 may determine, at step 310, whether the child is in the range of the interactive surface 110, which in this case may be an interactive glass display. At step 315, the system 100 may determine that the child is on a trajectory to collide with the interactive surface 110. Once the system 100 determines that the child is on the trajectory to collide with the interactive surface 110, the system 100 can transmit a signal to the interactive surface 110 to cause the interactive surface 110 to increase its opacity. Alternatively, or in addition to sending a signal to increase opacity, the system 100 can send a signal to the interactive surface 110 to generate and vibrate a low-pitch tone at step 325. At step 330 of the method 300, the child can see or hear, or both, the interactive surface 110 based on the increased opacity and the low-pitch tone. Once the child sees or hears, or both, the interactive surface 110, the child can change his trajectory so that he will no longer be on course to collide with the interactive surface 110 at step 335. At step 340, the system 100 can determine that the child is on a new trajectory that would not cause a collision with the interactive surface 110. At step 345, the system 100 can send signals to the interactive surface 110 to become transparent once again and to stop vibrating the tone.
  • Another exemplary method 400 for providing awareness of an interactive surface 110 (in this case a glass display), as shown in FIG. 4, involves a situation where a child does not heed the first warning from the interactive surface 110. The method 400 may include, at step 405, having the child running around his family's house as in the previous example. While the child is running around the house, the system 100 may determine, at step 410, the child's movement information, such as his position, velocity, and trajectory. At step 415, the system 100 may determine that the child is getting too close to the interactive surface 110. As a result, the method 400 may include having the system 100 send a signal to the interactive surface 110 to increase its opacity such that the interactive surface 110 is 20% white at step 420. At step 425, the system 100 can cause the interactive surface 110 to emit a 700 hertz (Hz) vibration.
  • At step 430, the system 100 can determine that, despite the initial notifications, the child is getting even closer to the interactive surface 110. Now, the system 100 can send another signal to the interactive surface 110 to increase its opacity such that the interactive surface 110 is now 40% white at step 435. Additionally, at step 440, the system 100 can cause the interactive surface 110 to adjust the 700 Hz vibration to a 1400 Hz vibration. At this point, the child may see or hear, or both, the interactive surface 110 at step 445. At step 450, the system 100 can determine that a collision with the interactive surface 110 has been avoided because the child has changed his trajectory. At step 455, the child can run elsewhere and the system 100 can determine the child's new position, velocity, and trajectory and find that the child is no longer at risk of a collision. At step 460, the system 100 can transmit a signal to the interactive surface 110 to return to a clear or transparent state. Finally, the system 100 can transmit a signal to the interactive surface 110 to stop emitting vibrations at step 465.
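The escalation in method 400 (20% white with a 700 Hz vibration at the first warning, then 40% white with a 1400 Hz vibration when the child keeps approaching, and a return to a clear, silent state once the collision is avoided) can be sketched as a small state transition. The `SurfaceState` type and the doubling rule for repeat warnings are illustrative assumptions; only the specific opacity and frequency values come from the method steps above.

```python
# Hypothetical state machine for the method-400 warning escalation.
from dataclasses import dataclass

@dataclass
class SurfaceState:
    opacity_pct: float = 0.0   # 0 = transparent, 100 = fully white
    vibration_hz: float = 0.0  # 0 = not vibrating

def escalate(state: SurfaceState) -> SurfaceState:
    """First warning: 20% white at 700 Hz (steps 420/425); each repeat
    warning doubles both (steps 435/440)."""
    if state.vibration_hz == 0.0:
        return SurfaceState(opacity_pct=20.0, vibration_hz=700.0)
    return SurfaceState(opacity_pct=min(100.0, state.opacity_pct * 2),
                        vibration_hz=state.vibration_hz * 2)

def clear(state: SurfaceState) -> SurfaceState:
    """Collision avoided (steps 460/465): clear glass, stop vibrating."""
    return SurfaceState()

s = escalate(SurfaceState())  # first warning
s = escalate(s)               # child still approaching
print(s)                      # 40% white, 1400 Hz
print(clear(s))               # back to transparent and silent
```

A real controller would of course drive hardware rather than return values, but the transitions mirror the sequence of signals the method describes.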
  • In yet another exemplary method 500 for providing awareness of an interactive surface 110, as shown in FIG. 5, the method 500 may include, at step 502, recording media content associated with an object 105 that is in the range of an interactive surface 110. The media content can be sound content, visual content, vibrations, any combination thereof, or any other content. The media content may be recorded, for example, by the image capture device 120 or other appropriate device. At step 504, the method 500 may include determining a position and a velocity of the object 105 based on the media content or on any of the information gathered by any of the components in the system 100, or any combination thereof. The position and the velocity, for example, may be determined by the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device. The method 500 may include, at step 506, determining if the object 105 has a trajectory that would cause the object to collide with the interactive surface 110 based on the position and velocity of the object 105. The trajectory may also be determined based on sensed vibrations, sounds, images, or other information gathered by the components of the system 100. In one embodiment, the trajectory may be determined by the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device.
  • At step 508, if the object 105 is determined not to be on a trajectory to collide with the interactive surface 110, then the method can return to step 502 or the method can conclude. However, if the object 105 is determined to be on a trajectory to collide with the interactive surface 110, the method 500 may include generating a notification on or near the interactive surface 110 to indicate the location of the interactive surface 110 at step 510 via the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device. At step 512, the method 500 may include determining if the object 105 is still on the trajectory to collide with the interactive surface 110 despite the notifications. Step 512 may be performed by the location determination device 115, the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device. If the object 105 is no longer on course to collide with the interactive surface 110 after the initial notifications, the method 500 may include reducing or removing the notifications from or around the interactive surface 110 at step 514. However, if the object 105 is still on course for a collision with the interactive surface 110, then the method 500 may include increasing the intensity or number of notifications, or both, so that the object 105 hopefully changes its trajectory at step 516. In one embodiment, steps 514 and 516 may be performed by the interactive surface 110, the server 150, the computer system 600, any combination thereof, or other appropriate device.
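The observe-decide-notify loop of method 500 (steps 502-516) can be condensed into a short monitoring sketch. Everything here is an illustrative assumption: sensor input is faked as a list of measured distances, and a single integer "notification level" stands in for the intensity and number of notifications.

```python
# Hypothetical control loop mirroring method 500: observe the object,
# test the collision trajectory, then add, intensify, or reduce/remove
# notifications.

def monitor(distances_m, collide_below_m=3.0):
    """Yield the notification level after each observation (0 = none)."""
    level = 0
    for d in distances_m:                         # steps 502/504: observe
        on_collision_course = d < collide_below_m  # steps 506/508: decide
        if on_collision_course:
            level += 1                # steps 510/516: notify or intensify
        else:
            level = max(0, level - 1)  # step 514: reduce or remove
        yield level

# Object approaches, then veers away: levels rise, then fall back to 0.
print(list(monitor([5.0, 2.5, 2.0, 4.0, 6.0])))  # [0, 1, 2, 1, 0]
```

Using a simple distance cutoff in place of a full trajectory computation keeps the loop structure visible; in practice the test at step 506 would use the position-and-velocity trajectory described earlier.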
  • In one embodiment, the system 100 and methods described herein may include presenting the notifications only on a portion of the interactive surface 110. For instance, while notifications are presented on some portions, other portions of the interactive surface 110 can remain in normal operational mode. In another embodiment, the systems and methods may include enabling a person to use the interactive surface 110 as a whiteboard. For example, if the system 100 causes the interactive surface 110 to become opaque as a notification to a person approaching the interactive surface 110, the opaque portion may be used as a whiteboard by the person. In an embodiment, the trajectory calculated by the system 100 may be a predicted path that the person or object 105 may be determined to be following.
  • In another embodiment, the system 100 and methods may include providing a specific sequence or pattern of notifications based on the situation that presents itself. For example, the system 100 may cause the interactive surface 110 to emit a specific low-pitched, multi-pulse sound sequence that can indicate when the interactive surface 110 can be utilized as a computer display, whiteboard, or for certain other purposes. In another embodiment, the system 100 and methods may include having the interactive surface 110 visually or orally identify its functions to users and any data to which it has access. In another embodiment, the system 100 and methods can utilize infrared technology, motion detectors, laser detectors, or any other type of detector to assist in determining a position, velocity, or other information associated with an object 105. In another embodiment, the method 500 may include determining the positions and velocities of multiple objects 105. Multiple trajectories may be determined either for a single object 105 or for multiple objects 105, and notifications may be generated so that they are tailored to each object 105 that is approaching the interactive surface 110. It is important to note that the methods described above may incorporate any of the functionality, devices, or features of the systems described above, or otherwise, and are not intended to be limited to the description or examples provided herein.
  • The foregoing is provided for purposes of illustrating, explaining, and describing embodiments of this invention. Modifications and adaptations to these embodiments will be apparent to those skilled in the art and may be made without departing from the scope or spirit of this invention. Upon reviewing the aforementioned embodiments, it would be evident to an artisan with ordinary skill in the art that said embodiments can be modified, reduced, or enhanced without departing from the scope and spirit of the claims described below.
  • Referring now also to FIG. 6, at least a portion of the methodologies and techniques described with respect to the exemplary embodiments can incorporate a machine, such as, but not limited to, computer system 600, or other computing device within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies or functions discussed above. The machine may be configured to facilitate various operations conducted by the system 100. For example, the machine may be configured to, but is not limited to, assist the system 100 by providing processing power to assist with processing loads experienced in the system 100, by providing storage capacity for storing instructions or data traversing the system 100, by capturing media content, or by assisting with any other operations conducted by or within the system 100.
  • In some embodiments, the machine operates as a standalone device. In some embodiments, the machine may be connected (e.g., using a network 145) to and assist with operations performed by other machines, such as, but not limited to, the interactive surface 110, the location determination device 115, the image capture device 120, the air movement sensor 125, the sound sensor 130, the vibration sensor 135, the ultrasound device 137, the mobile device 140, the server 150, and the database 155, or any combination thereof. The machine may be connected with any component in the system 100. In a networked deployment, the machine may operate in the capacity of a server or a client user machine in server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The computer system 600 may include a processor 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 604 and a static memory 606, which communicate with each other via a bus 608. The computer system 600 may further include a video display unit 610 (e.g., a liquid crystal display (LCD), a flat panel, a solid state display, or a cathode ray tube (CRT)). The computer system 600 may include an input device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a mouse), a disk drive unit 616, a signal generation device 618 (e.g., a speaker or remote control) and a network interface device 620.
  • The disk drive unit 616 may include a machine-readable medium 622 on which is stored one or more sets of instructions 624 (e.g., software) embodying any one or more of the methodologies or functions described herein, including those methods illustrated above. The instructions 624 may also reside, completely or at least partially, within the main memory 604, the static memory 606, or within the processor 602, or a combination thereof, during execution thereof by the computer system 600. The main memory 604 and the processor 602 also may constitute machine-readable media.
  • Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices can likewise be constructed to implement the methods described herein. Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example system is applicable to software, firmware, and hardware implementations.
  • In accordance with various embodiments of the present disclosure, the methods described herein are intended for operation as software programs running on a computer processor. Furthermore, software implementations, including but not limited to distributed processing, component/object distributed processing, parallel processing, or virtual machine processing, can also be constructed to implement the methods described herein.
  • The present disclosure contemplates a machine-readable medium containing instructions 624 so that a device connected to the communications network 145 can send or receive voice, video or data, and communicate over the network 145 using the instructions 624. The instructions 624 may further be transmitted or received over the network 145 via the network interface device 620.
  • While the machine-readable medium 622 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term “machine-readable medium” shall accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical media such as a disk or tape; and other self-contained information archives or sets of archives, each considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
  • The illustrations of arrangements described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other arrangements will be apparent to those of skill in the art upon reviewing the above description. Other arrangements may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
  • Thus, although specific arrangements have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific arrangement shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments and arrangements of the invention. Combinations of the above arrangements, and other arrangements not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description. Therefore, it is intended that the disclosure not be limited to the particular arrangement(s) disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments and arrangements falling within the scope of the appended claims.

Claims (20)

We claim:
1. A system, comprising:
a memory that stores instructions;
a processor that executes the instructions to perform operations, the operations comprising:
determining, based on a position and a velocity of an object, if the object has a trajectory that would cause the object to collide with a user electronic interactive glass surface, wherein the determining is also based on an output from a sensor, wherein the output from the sensor includes image data associated with the object; and
generating a notification if the object is determined to have the trajectory that would cause the object to collide with the user electronic interactive glass surface, wherein the notification comprises adjusting an opacity of the user electronic interactive glass surface.
2. The system of claim 1, wherein the operations further comprise receiving the output from the sensor.
3. The system of claim 1, wherein the operations further comprise determining, based on a shape of the object, if the object has the trajectory that would cause the object to collide with the user electronic interactive glass surface.
4. The system of claim 1, wherein the operations further comprise detecting the position and the velocity of the object based on the output from the sensor.
5. The system of claim 1, wherein the operations further comprise presenting the notification on a portion of the user electronic interactive glass surface.
6. The system of claim 1, wherein the operations further comprise enabling the user electronic interactive glass surface to be utilized as a whiteboard.
7. The system of claim 1, wherein the operations further comprise removing the notification if the object is determined to no longer have the trajectory that would cause the object to collide with the user electronic interactive glass surface.
8. The system of claim 1, wherein the operations further comprise transmitting a signal to cause the user electronic interactive glass surface to visually identify a function on the user electronic interactive glass surface.
9. The system of claim 1, wherein the operations further comprise determining if the object makes a sound.
10. The system of claim 9, wherein the operations further comprise generating the notification if the sound is greater than a threshold value.
11. The system of claim 1, wherein the operations further comprise recording media content associated with the object.
12. The system of claim 1, wherein the operations further comprise generating a sequence of notifications.
13. The system of claim 1, wherein the operations further comprise generating an image of the object based on the output.
14. A method, comprising:
determining, based on a position and a velocity of an object, if the object has a trajectory that would cause the object to collide with a user electronic interactive glass surface, wherein the determining is also based on an output from a sensor, wherein the output from the sensor includes image data associated with the object, wherein the trajectory is determined by utilizing instructions from a memory that are executed by a processor; and
generating, by utilizing the instructions from the memory that are executed by the processor, a notification if the object is determined to have the trajectory that would cause the object to collide with the user electronic interactive glass surface, wherein the notification comprises adjusting an opacity of the user electronic interactive glass surface.
15. The method of claim 14, further comprising determining, based on a shape of the object, if the object has the trajectory that would cause the object to collide with the user electronic interactive glass surface.
16. The method of claim 14, further comprising determining, based on a size of the object, if the object has the trajectory that would cause the object to collide with the user electronic interactive glass surface.
17. The method of claim 14, further comprising determining, based on an echo emitted by the object, if the object has the trajectory that would cause the object to collide with the user electronic interactive glass surface.
18. The method of claim 14, further comprising increasing an intensity of the notification as the object gets closer to the user electronic interactive glass surface.
19. The method of claim 14, further comprising decreasing an intensity of the notification as the object gets farther away from the user electronic interactive glass surface.
20. A non-transitory computer-readable medium comprising instructions, which when executed by a processor, cause the processor to perform operations comprising:
determining, based on a position and a velocity of an object, if the object has a trajectory that would cause the object to collide with a user electronic interactive glass surface, wherein the determining is also based on an output from a sensor, wherein the output from the sensor includes image data associated with the object; and
providing, by utilizing the instructions that are executed by the processor, a notification if the object is determined to have the trajectory that would cause the object to collide with the user electronic interactive glass surface, wherein the notification comprises adjusting an opacity of the user electronic interactive glass surface.
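The independent claims above recite determining, from an object's position and velocity, whether its straight-line trajectory would cause it to collide with the interactive glass surface, and (in claims 18–19) scaling the notification's intensity with the object's distance. The following is a minimal illustrative sketch of that logic only, not the patented implementation; the axis-aligned pane geometry, units, and all names (`GlassSurface`, `predict_collision`, `notification_opacity`) are assumptions introduced for this example.

```python
from dataclasses import dataclass

@dataclass
class GlassSurface:
    """Hypothetical vertical glass pane lying in the plane x = plane_x,
    bounded in y and z (units are illustrative, e.g. metres)."""
    plane_x: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

def predict_collision(pos, vel, surface):
    """Return the time until the object's straight-line trajectory strikes
    the pane within its bounds, or None if no collision is predicted."""
    px, py, pz = pos
    vx, vy, vz = vel
    if vx == 0:
        return None  # moving parallel to the pane; no crossing
    t = (surface.plane_x - px) / vx
    if t <= 0:
        return None  # pane is behind the object's direction of travel
    hit_y = py + vy * t
    hit_z = pz + vz * t
    if surface.y_min <= hit_y <= surface.y_max and surface.z_min <= hit_z <= surface.z_max:
        return t
    return None  # trajectory misses the bounded surface

def notification_opacity(distance, max_distance=5.0):
    """Claims 18-19 style intensity scaling: the notification grows
    stronger (here, more opaque) as the object approaches, clamped to [0, 1]."""
    return max(0.0, min(1.0, 1.0 - distance / max_distance))
```

For example, with a pane at x = 0 spanning 0–2 in y and z, an object at (−3, 1, 1) moving at (1.5, 0, 0) yields a predicted impact in 2.0 time units, after which the opacity adjustment recited in claim 1 could be driven by `notification_opacity` as the object closes the remaining distance.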
US15/412,183 2012-10-02 2017-01-23 Notification system for providing awareness of an interactive surface Active US9911298B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/412,183 US9911298B2 (en) 2012-10-02 2017-01-23 Notification system for providing awareness of an interactive surface

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/633,579 US9292136B2 (en) 2012-10-02 2012-10-02 Notification system for providing awareness of an interactive surface
US15/040,045 US9552713B2 (en) 2012-10-02 2016-02-10 Notification system for providing awareness of an interactive surface
US15/412,183 US9911298B2 (en) 2012-10-02 2017-01-23 Notification system for providing awareness of an interactive surface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/040,045 Continuation US9552713B2 (en) 2012-10-02 2016-02-10 Notification system for providing awareness of an interactive surface

Publications (2)

Publication Number Publication Date
US20170132896A1 true US20170132896A1 (en) 2017-05-11
US9911298B2 US9911298B2 (en) 2018-03-06

Family

ID=50384619

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/633,579 Active 2033-04-12 US9292136B2 (en) 2012-10-02 2012-10-02 Notification system for providing awareness of an interactive surface
US15/040,045 Active US9552713B2 (en) 2012-10-02 2016-02-10 Notification system for providing awareness of an interactive surface
US15/412,183 Active US9911298B2 (en) 2012-10-02 2017-01-23 Notification system for providing awareness of an interactive surface

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/633,579 Active 2033-04-12 US9292136B2 (en) 2012-10-02 2012-10-02 Notification system for providing awareness of an interactive surface
US15/040,045 Active US9552713B2 (en) 2012-10-02 2016-02-10 Notification system for providing awareness of an interactive surface

Country Status (1)

Country Link
US (3) US9292136B2 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9895548B2 (en) 2013-01-23 2018-02-20 West Affum Holdings Corp. Wearable cardiac defibrillator (WCD) system controlling conductive fluid deployment per impedance settling at terminal value
US10543377B2 (en) 2013-02-25 2020-01-28 West Affum Holdings Corp. Wearable cardioverter defibrillator (WCD) system making shock/no shock determinations by aggregating aspects of patient parameters
US10500403B2 (en) 2013-02-25 2019-12-10 West Affum Holdings Corp. WCD system validating detected cardiac arrhythmias thoroughly so as to not sound loudly due to some quickly self-terminating cardiac arrhythmias
US9757579B2 (en) 2013-02-25 2017-09-12 West Affum Holdings Corp. Wearable cardioverter defibrillator (WCD) system informing patient that it is validating just-detected cardiac arrhythmia
US9827431B2 (en) 2013-04-02 2017-11-28 West Affum Holdings Corp. Wearable defibrillator with no long-term ECG monitoring
US10016613B2 (en) 2013-04-02 2018-07-10 West Affum Holdings Corp. Wearable cardiac defibrillator system long-term monitoring alternating patient parameters other than ECG
CN104121516B (en) * 2014-07-11 2016-08-17 京东方科技集团股份有限公司 Lamp holder and desk lamp for desk lamp
US20160253891A1 (en) * 2015-02-27 2016-09-01 Elwha Llc Device that determines that a subject may contact a sensed object and that warns of the potential contact
US9521365B2 (en) 2015-04-02 2016-12-13 At&T Intellectual Property I, L.P. Image-based techniques for audio content
US10619397B2 (en) * 2015-09-14 2020-04-14 Rytec Corporation System and method for safety management in roll-up doors
WO2017106846A2 (en) * 2015-12-18 2017-06-22 Iris Automation, Inc. Real-time visual situational awareness system
US10291742B2 (en) * 2016-07-01 2019-05-14 Google Llc Damage sensors for a mobile computing device
EP3899186A4 (en) 2018-12-21 2022-10-05 Rytec Corporation Safety system and method for overhead roll-up doors
CN114511982B (en) * 2022-04-19 2022-07-08 亿慧云智能科技(深圳)股份有限公司 Smoke alarm method and intelligent smoke alarm
CN116189380A (en) * 2022-12-26 2023-05-30 湖北工业大学 Man-machine safety interaction method, system, device and medium for mechanical equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
US6650242B2 (en) * 2001-05-25 2003-11-18 Embridge Lake Pty Ltd Mobile plant proximity detection and warning system
US20050137786A1 (en) * 1997-10-22 2005-06-23 Intelligent Technologies International Inc. Communication method and arrangement
US7027808B2 (en) * 2002-05-21 2006-04-11 Philip Bernard Wesby System and method for monitoring and control of wireless modules linked to assets
US7796081B2 (en) * 1997-10-22 2010-09-14 Intelligent Technologies International, Inc. Combined imaging and distance monitoring for vehicular applications
US20110018697A1 (en) * 2009-07-22 2011-01-27 Immersion Corporation Interactive Touch Screen Gaming Metaphors With Haptic Feedback
US20120105312A1 (en) * 2010-10-29 2012-05-03 Microsoft Corporation User Input Device
US20120265505A1 (en) * 2009-07-02 2012-10-18 Thales Method for simulating behavior in a reconfigurable infrastructure and system implementing said method
US20130058623A1 (en) * 2011-09-07 2013-03-07 Vesstech, Inc. Video warning systems for devices, products, containers and other items

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6384742B1 (en) * 1994-06-08 2002-05-07 Michael A. Harrison Pedestrian crosswalk signal apparatus—pedestrian crosswalk
WO2001003016A1 (en) 1999-06-30 2001-01-11 Silverbrook Research Pty Ltd Method and system for sensing device registration
US6828918B2 (en) * 2000-11-29 2004-12-07 International Business Machines Corporation Personalized accessibility identification receiver/transmitter and method for providing assistance
JP3957505B2 (en) * 2001-12-26 2007-08-15 株式会社ワコム 3D information detection device, 3D information sensor device
ES2391556T3 (en) 2002-05-03 2012-11-27 Donnelly Corporation Object detection system for vehicles
US7978184B2 (en) 2002-11-08 2011-07-12 American Greetings Corporation Interactive window display
US6980125B1 (en) * 2003-04-09 2005-12-27 John Barber Warning light system for alerting pedestrians and passenger vehicle operators of an approaching emergency vehicle
US7548833B2 (en) * 2004-03-25 2009-06-16 Siemens Building Technologies, Inc. Method and apparatus for graphical display of a condition in a building system with a mobile display unit
US7046128B2 (en) 2004-05-26 2006-05-16 Roberts Kristie L Collision detection and warning system for automobiles
US8590087B2 (en) * 2004-12-14 2013-11-26 Rite-Hite Holding Corporation Lighting and signaling systems for loading docks
US7327253B2 (en) 2005-05-04 2008-02-05 Squire Communications Inc. Intruder detection and warning system
US7646886B2 (en) * 2005-05-11 2010-01-12 Lockheed Martin Corporation Closely-spaced multiple targets detection using a regional window as a discriminant function
JP4918981B2 (en) * 2005-11-04 2012-04-18 株式会社デンソー Vehicle collision determination device
US20080022596A1 (en) * 2006-07-27 2008-01-31 Boerger James C Door signaling system
US20080043099A1 (en) * 2006-08-10 2008-02-21 Mobileye Technologies Ltd. Symmetric filter patterns for enhanced performance of single and concurrent driver assistance applications
GB2452644B (en) 2006-10-10 2009-09-16 Promethean Ltd Automatic tool docking
JP4322913B2 (en) * 2006-12-19 2009-09-02 富士通テン株式会社 Image recognition apparatus, image recognition method, and electronic control apparatus
US8125348B2 (en) * 2007-06-29 2012-02-28 Verizon Patent And Licensing Inc. Automobile beacon, system and associated method
US8456293B1 (en) * 2007-10-22 2013-06-04 Alarm.Com Incorporated Providing electronic content based on sensor data
US20090187300A1 (en) * 2008-01-22 2009-07-23 David Wayne Everitt Integrated vehicle computer system
US8538171B2 (en) * 2008-03-28 2013-09-17 Honeywell International Inc. Method and system for object detection in images utilizing adaptive scanning
ATE527620T1 (en) * 2009-02-17 2011-10-15 Autoliv Dev METHOD AND SYSTEM FOR AUTOMATIC DETECTION OF OBJECTS IN FRONT OF A MOTOR VEHICLE
US8019390B2 (en) * 2009-06-17 2011-09-13 Pradeep Sindhu Statically oriented on-screen transluscent keyboard
US9104238B2 (en) * 2010-02-12 2015-08-11 Broadcom Corporation Systems and methods for providing enhanced motion detection
US8031085B1 (en) * 2010-04-15 2011-10-04 Deere & Company Context-based sound generation

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050137786A1 (en) * 1997-10-22 2005-06-23 Intelligent Technologies International Inc. Communication method and arrangement
US7796081B2 (en) * 1997-10-22 2010-09-14 Intelligent Technologies International, Inc. Combined imaging and distance monitoring for vehicular applications
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
US6650242B2 (en) * 2001-05-25 2003-11-18 Embridge Lake Pty Ltd Mobile plant proximity detection and warning system
US7027808B2 (en) * 2002-05-21 2006-04-11 Philip Bernard Wesby System and method for monitoring and control of wireless modules linked to assets
US20120265505A1 (en) * 2009-07-02 2012-10-18 Thales Method for simulating behavior in a reconfigurable infrastructure and system implementing said method
US20110018697A1 (en) * 2009-07-22 2011-01-27 Immersion Corporation Interactive Touch Screen Gaming Metaphors With Haptic Feedback
US20120105312A1 (en) * 2010-10-29 2012-05-03 Microsoft Corporation User Input Device
US20130058623A1 (en) * 2011-09-07 2013-03-07 Vesstech, Inc. Video warning systems for devices, products, containers and other items

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170024987A1 (en) * 2015-07-22 2017-01-26 Che Wei Lin Device and system for security monitoring
US9998713B2 (en) * 2015-07-22 2018-06-12 Che Wei Lin Device and system for security monitoring
CN107170196A (en) * 2017-07-21 2017-09-15 京东方科技集团股份有限公司 Anticollision device, collision-prevention device, CAS and avoiding collision

Also Published As

Publication number Publication date
US20160163173A1 (en) 2016-06-09
US9292136B2 (en) 2016-03-22
US9552713B2 (en) 2017-01-24
US9911298B2 (en) 2018-03-06
US20140091937A1 (en) 2014-04-03

Similar Documents

Publication Publication Date Title
US9911298B2 (en) Notification system for providing awareness of an interactive surface
KR102385263B1 (en) Mobile home robot and controlling method of the mobile home robot
US9471141B1 (en) Context-aware notifications
US10444930B2 (en) Head-mounted display device and control method therefor
KR102195897B1 (en) Apparatus for dectecting aucoustic event, operating method thereof, and computer-readable recording medium having embodied thereon a program which when executed by a computer perorms the method
US10360876B1 (en) Displaying instances of visual content on a curved display
US9652053B2 (en) Method of displaying pointing information and device for performing the method
US9704361B1 (en) Projecting content within an environment
US9723293B1 (en) Identifying projection surfaces in augmented reality environments
TW201633226A (en) Social reminders
US11303591B2 (en) Management system for audio and visual content
US10936880B2 (en) Surveillance
US20210327146A1 (en) Virtual anchoring systems and methods for extended reality
US20140313858A1 (en) Ultrasonic location determination
US11874959B2 (en) Dynamic notification surfacing in virtual or augmented reality scenes
WO2014012186A1 (en) System and method for managing video analytics results
CN112005282A (en) Alarm for mixed reality devices
JP2019036181A (en) Information processing apparatus, information processing method and program
US11460994B2 (en) Information processing apparatus and information processing method
US11915571B2 (en) Systems and methods for dynamically monitoring distancing using a spatial monitoring platform
EP2856765B1 (en) Method and home device for outputting response to user input
US11375275B2 (en) Method and system for using lip sequences to control operations of a device
US20160091966A1 (en) Stereoscopic tracking status indicating method and display apparatus
US9774814B1 (en) Display device control system
US11657699B1 (en) Methods and systems for outputting alerts on user interfaces

Legal Events

Date Code Title Description
AS Assignment

Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PRATT, JAMES H.;BELZ, STEVEN M.;SULLIVAN, MARC A.;REEL/FRAME:041044/0537

Effective date: 20120926

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4