US20050059489A1 - Motion sensing applications - Google Patents

Motion sensing applications

Info

Publication number
US20050059489A1
US20050059489A1 (application US10/803,655; US80365504A)
Authority
US
United States
Prior art keywords
motions
user
motion sensing
position information
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/803,655
Inventor
Taek Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/661,732 (external priority: US20050073497A1)
Application filed by Individual
Priority to US10/803,655 (US20050059489A1)
Publication of US20050059489A1
Abandoned (current legal status)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/06Receivers
    • H04B1/16Circuits
    • H04B1/20Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver
    • H04B1/202Circuits for coupling gramophone pick-up, recorder output, or microphone to receiver by remote control
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/14Receivers specially adapted for specific applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334Recording operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/4448Receiver circuitry for the reception of television signals according to analogue transmission standards for frame-grabbing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera

Abstract

A gaming device includes sensors, a display, and an interface device. At least first and second sensors are operatively configured to provide position information of at least first and second points, respectively. The position information should be sufficiently accurate to distinguish the first point from the second point, such that the provided position information of the first point with respect to the position information of the second point provides enough information to determine motions of a user. The interface device is configured to couple the sensors to the user so that the motions of the user can be visually displayed on the display.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation-in-part application of co-pending U.S. patent application Ser. No. 10/661,732, entitled “Remote Control Device Capable of Sensing Motion”, filed Sep. 12, 2003. Benefit of priority of the filing date of Sep. 12, 2003 is hereby claimed for common material, and the disclosure of the U.S. Patent Application is hereby incorporated by reference.
  • BACKGROUND
  • The present disclosure generally relates to motion sensing applications, and more specifically, to using GPS-based motion sensing in such applications.
  • Typically, electronic motion sensing has been configured with a plurality of accelerometers and at least one gyroscope to provide translation motion readings on three axes, and roll, pitch, and yaw readings. This configuration can be used in various motion sensing applications, including virtual reality applications and other interactive games.
  • Recently, several game developers (e.g., for the Sony PlayStation) have been using built-in video cameras to optically capture the motions of the player and incorporate them into the game. However, motion sensing devices that use accelerometers and gyroscopes, or video cameras, can be cumbersome and expensive.
  • For example, to sense the motion of a player in an interactive game, the motion sensing device should be placed or worn on the body of the player. However, motion sensing devices configured with gyroscopes and accelerometers can be too heavy and bulky to be comfortably placed or worn on the body of the player. Further, motion sensing devices configured with built-in cameras are relatively expensive and require complex software to interpret the captured motions into usable digital signals.
  • SUMMARY
  • A motion sensing device provides visual display of motions to a user. In one aspect, the motion sensing device includes sensors, a display, and an interface device. At least first and second sensors are operatively configured to provide position information of at least first and second points, respectively, on the motion sensing device. The position information should be sufficiently accurate to distinguish the first point from the second point, such that the provided position information of the first point with respect to the position information of the second point provides enough information to determine motions of the motion sensing device with respect to a visual axis of the user. The interface device is coupled to the display and the sensors, and operates to transmit the motions of the motion sensing device to the display.
  • In another aspect, a gaming device includes sensors, a display, and an interface device. At least first and second sensors are operatively configured to provide position information of at least first and second points, respectively. The position information should be sufficiently accurate to distinguish the first point from the second point, such that the provided position information of the first point with respect to the position information of the second point provides enough information to determine motions of a user. The interface device is configured to couple the sensors to the user so that the motions of the user can be visually displayed on the display.
  • In a further aspect, a gaming method is disclosed. The gaming method includes providing at least first and second sensors to compute position information of at least first and second points, respectively. The position information should be sufficiently accurate to distinguish the first point from the second point, such that the computed position information of the first point with respect to the position information of the second point provides enough information to determine motions of a user. The gaming method also includes coupling the sensors to the user so that the motions of the user can be visually displayed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Different aspects of the disclosure will be described with reference to the accompanying drawings.
  • FIG. 1 shows a motion sensing device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a motion sensing device according to an embodiment of the present invention.
  • FIG. 3 illustrates a roll motion sensed by a motion sensing device in accordance with an embodiment of the present invention.
  • FIG. 4A illustrates movement of a player's head on a screen in response to the roll motion of the motion sensing device according to an embodiment of the present invention.
  • FIG. 4B through FIG. 4D illustrate movements of an entire display in response to the roll motion of the motion sensing device.
  • FIG. 5 illustrates a pitch motion sensed by a motion sensing device in accordance with an embodiment of the present invention.
  • FIG. 6A illustrates visual displays for the pitch motion.
  • FIG. 6B through FIG. 6D illustrate movements of an entire display in response to the pitch motion of the motion sensing device.
  • FIG. 7 illustrates a yaw motion sensed by a motion sensing device in accordance with an embodiment of the present invention.
  • FIG. 8A illustrates visual displays for the yaw motion.
  • FIG. 8B through FIG. 8D illustrate movements of an entire display in response to the yaw motion of the motion sensing device.
  • FIG. 9 illustrates a horizontal translation motion sensed by a motion sensing device in accordance with an embodiment of the present invention.
  • FIG. 10A illustrates visual displays for the horizontal translation motion.
  • FIG. 10B through FIG. 10F illustrate movements of an entire display in response to the horizontal translation motion of the motion sensing device.
  • FIG. 11 illustrates a vertical translation motion sensed by a motion sensing device in accordance with an embodiment of the present invention.
  • FIG. 12A illustrates visual displays for the vertical translation motion.
  • FIG. 12B through FIG. 12D illustrate movements of an entire display in response to the vertical translation motion of the motion sensing device.
  • FIG. 13A through FIG. 13C illustrate different implementations of the motion sensing device in accordance with various embodiments.
  • FIG. 14A through FIG. 14D illustrate different placements of the motion sensing device on the player's body.
  • DETAILED DESCRIPTION
  • Various embodiments are described for a motion sensing device that can sense motion in five degrees of freedom, which may include roll, pitch, and yaw directional motions, and horizontal and vertical translation motions. These motions can be illustrated on an electronic screen (e.g., a TV screen or a computer monitor) of a game or simulation as movements of an icon (e.g., an electronic depiction of a person) or as movements of an entire screen. The details of applications using the motion sensing device are described below.
  • FIG. 1 shows a motion sensing device 100 according to an embodiment of the present invention. FIG. 1 also illustrates a block diagram of an external device 120 and a screen 122 that interfaces with the external device. In one embodiment, the external device 120 is a computer. In another embodiment, the external device 120 is a television. In a further embodiment, the external device 120 includes any driver that can drive a display device to graphically illustrate the movement of the motion sensing device 100.
  • In the illustrated embodiment of FIG. 1, the motion sensing device 100 includes an antenna 110 and corresponding electronic circuitry, which are used to transmit and receive radio frequency signals to and from the external device 120.
  • In FIG. 1, the motion sensing device 100 is configured as a headset to be worn on the player's head. The sensors 102, 104 of the device 100 sense the movement of the device 100 with respect to axis 106 or 108 (an axis that comes out of the page). In other implementations, a motion sensing device can be configured as any apparatus having a plurality of sensors that can sense the movement of the device with respect to some axis fixedly related to the vision axis of the player. These other implementations are described in detail below.
  • The movement of the device 100 is measured in terms of movement of an axis (e.g., axis 106) linking the sensors 102, 104. Thus, in FIG. 1, the movement of the axis 106 with respect to the axis 108 can be used to control and move a graphical icon, such as a person, or an entire image displayed on the screen 122. Details of various motions of the device are described below.
  • Various motions of the motion sensing device 100 are visually fed back to a user by the movement of the graphical icon displayed on the screen 122, or by the movement of the entire display shown on the screen 122. Movement of the icon or the entire display copies the motions of the motion sensing device 100. Thus, roll, pitch, yaw, horizontal translation, and vertical translation motions are combined and processed to produce a resultant movement of the icon or the entire display on the screen 122.
  • A block diagram of a motion sensing device 200 according to an embodiment of the present invention is shown in FIG. 2. The motion sensing device 200 comprises a main processor 202 and at least first and second sensors 220, which are operatively configured to provide position information of at least first and second positions, such as 112, 114 on the motion sensing device 100 of FIG. 1. The position information provided by the sensors 220 should be sufficiently accurate to distinguish the first position (e.g., position 112) from the second position (e.g., position 114), such that the provided position information of the first position with respect to the second position provides enough information to the processor 202 to determine roll, pitch, yaw, horizontal and vertical translation motions of the motion sensing device.
  • The main processor 202 receives the position information of the first and second positions. The main processor 202 includes a motion converter 230 that processes the position information to determine angle and distance of the roll, pitch, yaw, horizontal and vertical translation motions. The processor 202 also includes a movement converter 232 which converts these motions into an amount of icon or display movement on the main screen. The main processor 202 interfaces with external devices (e.g., a computer 120 shown in FIG. 1) through a transceiver 208 and an antenna 210. Thus, the amount of icon or display movement is transmitted to an external device through the transceiver 208. The transceiver 208 also receives commands and messages from the external device.
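  • The data flow of FIG. 2 can be summarized in the short sketch below. It is only an illustration: the names (read_fix, compute, to_screen_movement, send) are hypothetical stand-ins for the sensors 220, motion converter 230, movement converter 232, and transceiver 208; the specification does not prescribe an implementation.

```python
# Illustrative sketch of the FIG. 2 data flow. All names are hypothetical
# stand-ins for sensors 220, motion converter 230, movement converter 232,
# and transceiver 208; the patent does not prescribe an implementation.

def update_once(sensors, motion_converter, movement_converter, transceiver):
    # Sensors 220: one position fix per sensor (e.g., GPS-derived x, y, z).
    fix_1, fix_2 = sensors[0].read_fix(), sensors[1].read_fix()

    # Motion converter 230: angles and distances of the roll, pitch, yaw,
    # horizontal translation, and vertical translation motions.
    motions = motion_converter.compute(fix_1, fix_2)

    # Movement converter 232: amount of icon or display movement.
    movement = movement_converter.to_screen_movement(motions)

    # Transceiver 208: transmit the movement to the external device 120.
    transceiver.send(movement)
    return movement
```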
  • FIGS. 3, 5, 7, 9, and 11 illustrate various motions sensed by a motion sensing device in accordance with an embodiment of the present invention. In the illustrated embodiment of FIG. 3, the motion sensing device 300 is shown in a configuration in which a device axis 306 (i.e., the axis that connects the two sensors 310, 312) makes a “roll” motion 302 with respect to the visual axis 308 (i.e., the axis that is coming out of the person's nose or eyes). Therefore, the “roll” motion 302 is defined in this specification as a counter-clockwise angular movement of the axis 306 linking the sensors 310 and 312 with respect to the visual axis 308.
  • As will be described further below, the axis 306 can be calculated by precisely computing the positions of the sensors 310, 312 and taking the difference between the positions. By successively taking the differences as the sensors 310, 312 move, the angular movement of the axis 306 with respect to the visual axis 308 can be calculated.
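  • As a concrete illustration of that difference-of-positions calculation, the sketch below computes the device axis from two sensor fixes and the roll angle as the change in orientation of that axis in the plane perpendicular to the visual axis. The coordinate convention (x lateral, y vertical, z along the visual axis) is an assumption made only for this example.

```python
import math

def device_axis(pos_1, pos_2):
    """Axis 306: the vector difference between sensor positions 310 and 312."""
    return tuple(a - b for a, b in zip(pos_1, pos_2))

def roll_angle(prev_axis, curr_axis):
    """Counter-clockwise angular movement of the device axis, measured in the
    plane perpendicular to the visual axis (assumed here to be the x-y plane)."""
    prev = math.atan2(prev_axis[1], prev_axis[0])
    curr = math.atan2(curr_axis[1], curr_axis[0])
    return curr - prev

# Two successive fixes of the two sensors (x, y, z); the values are made up.
axis_before = device_axis((1.00, 0.00, 0.0), (-1.00, 0.00, 0.0))
axis_after = device_axis((0.98, 0.20, 0.0), (-0.98, -0.20, 0.0))
print(math.degrees(roll_angle(axis_before, axis_after)))  # roughly 11.5 degrees of roll
```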
  • FIG. 4A illustrates movement of a player's head 400 on a screen in response to the roll motion of the motion sensing device according to an embodiment of the present invention. Thus, when the player makes a roll motion 302 as shown in FIG. 3, the player's head 400 in FIG. 4A moves between directions B and D through direction C. In other embodiments, the roll motion of the motion sensing device can produce movement of different parts of the player or movement of an icon other than the player, such as a directional arrow or cursor.
  • In further embodiments, FIG. 4B through FIG. 4D illustrate movement of an entire display in response to the roll motion of the motion sensing device. For example, visual display of FIG. 4B corresponds to player vision in direction B, visual display of FIG. 4C corresponds to player vision in direction C, and visual display of FIG. 4D corresponds to player vision in direction D.
  • FIG. 5 and FIGS. 6A through 6D illustrate visual displays for a “pitch” motion that correspond to FIG. 3 and FIGS. 4A through 4D for the roll motion. Thus, in FIG. 5, the player's visual axis 500 makes a pitch motion 502, and the player's head 600 in FIG. 6A moves up and down between directions B and D through direction C. Visual displays in FIGS. 6B through 6D show objects in different pitch angles, or elevations, corresponding to directions B through D in FIG. 6A.
  • FIG. 7 and FIGS. 8A through 8D illustrate visual displays for a “yaw” motion that correspond to FIG. 3 and FIGS. 4A through 4D for the roll motion. Thus, in FIG. 7, the player's visual axis 700 makes a yaw motion 702, and the player's head 800 in FIG. 8A moves angularly sideways between directions B and D through direction C. Visual displays in FIGS. 8B through 8D show objects in different yaw angles corresponding to directions B through D in FIG. 8A.
  • FIG. 9 and FIGS. 10A through 10D illustrate visual displays for a horizontal translation motion that correspond to FIG. 3 and FIGS. 4A through 4D for the roll motion. FIGS. 10E and 10F illustrate further movements within the horizontal translation plane. Thus, in FIG. 9, the player's visual axis 900 makes a horizontal translation motion 902 or 904, and the player's head 1000 in FIG. 10A moves forward and backward between directions B and D through direction C, or moves laterally sideways between directions E and F through direction C. Visual displays in FIGS. 10B through 10F show objects in different horizontal positions corresponding to directions B through F in FIG. 10A.
  • FIG. 11 and FIGS. 12A through 12D illustrate visual displays for a vertical translation motion that correspond to FIG. 3 and FIGS. 4A through 4D for the roll motion. Thus, in FIG. 11, the player's visual axis 1100 makes a vertical translation motion 1102, and the player's head 1200 in FIG. 12A vertically moves up and down between directions B and D through direction C. Visual displays in FIGS. 12B through 12D show objects in different vertical translation positions corresponding to directions B through D in FIG. 12A.
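  • The translation motions can be pictured in the same way. One plausible reading, not spelled out in the specification, is to track the midpoint of the two sensor positions: its displacement across and along the visual axis gives the horizontal translation of FIGS. 9 and 10, and its displacement up or down gives the vertical translation of FIGS. 11 and 12. The sketch below uses that assumption and the same hypothetical coordinate frame as the earlier example.

```python
def midpoint(pos_1, pos_2):
    return tuple((a + b) / 2.0 for a, b in zip(pos_1, pos_2))

def translation(prev_mid, curr_mid):
    """Decompose the midpoint displacement into translation motions.

    Assumed frame: x lateral (left/right), y vertical (up/down),
    z along the visual axis (forward/backward).
    """
    dx, dy, dz = (c - p for c, p in zip(curr_mid, prev_mid))
    return {"lateral": dx, "vertical": dy, "forward": dz}

before = midpoint((1.0, 0.0, 0.0), (-1.0, 0.0, 0.0))
after = midpoint((1.0, 0.1, 0.3), (-1.0, 0.1, 0.3))
print(translation(before, after))  # {'lateral': 0.0, 'vertical': 0.1, 'forward': 0.3}
```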
  • FIGS. 13A through 13C illustrate different implementations of the motion sensing device in accordance with various embodiments. For example, FIG. 13A illustrates the motion sensing device 1300 configured as a headset worn on the player's head. FIG. 13B shows a similar configuration in which the sensors of the motion sensing device 1302 are in communication wirelessly. FIG. 13C shows another configuration in which the motion sensing device 1304 is configured as a pair of visual-display glasses.
  • FIGS. 14A through 14D illustrate different placements of the motion sensing device on the player's body. Thus, depending on a particular placement, the motion sensing device can be configured for different games or simulations. For example, FIG. 14A shows the motion sensing device 1400 worn around the waist area of the player. Thus, in this configuration, the motion sensing device 1400 can be used in various dancing games or other games that sense waist movement.
  • In one embodiment, the device 1400 can be attached to the body of the user by a strap. In another embodiment, the device 1400 can be attached to the body of the user by an attachment element such as a hook-and-loop fastener.
  • The sensors of the device 1402 in FIG. 14B are worn around the wrists of the player. Thus, in this configuration, the device 1402 can be used in a tennis game. The sensors of the device 1404 in FIG. 14C are worn around the ankles of the player. Thus, in this configuration, the device 1404 can be used in a soccer game. The sensors of the device 1406 in FIG. 14D are configured as a combination of above-described configurations. Thus, this configuration can be used for games or simulations requiring motion inputs from multiple sources.
  • Various motions of the motion sensing device have been individually described above to illustrate the different possible movements of the player's visual axis. However, it should be understood that the processor 202, and in particular the motion converter 230, processes these motions in combination to provide a resultant movement to the icon or the entire display for each instant in time. As described above, the movement converter 232 processes the resultant movement received from the motion converter 230 to generate motion parameters (e.g., a position vector and an angle) to move the icon or the entire display correspondingly. Further, it should be understood that the motion sensing device described above can be used to provide a visual display of the user's motions in a game or simulation.
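  • For instance, a movement converter along these lines might fold the five per-instant motions into a single set of motion parameters, a position vector plus an angle, before handing them to the transceiver. The scale factors and the mapping below are illustrative assumptions, not taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class MotionParameters:
    """Resultant movement for one instant: a position vector and an angle."""
    dx: float     # horizontal screen movement (pixels)
    dy: float     # vertical screen movement (pixels)
    angle: float  # icon rotation (degrees), driven here by the roll motion

def to_motion_parameters(roll_deg, pitch_deg, yaw_deg, lateral, vertical,
                         pixels_per_meter=500.0, pixels_per_degree=4.0):
    # Yaw and lateral translation both move the icon sideways; pitch and
    # vertical translation both move it up and down; roll rotates it.
    dx = yaw_deg * pixels_per_degree + lateral * pixels_per_meter
    dy = pitch_deg * pixels_per_degree + vertical * pixels_per_meter
    return MotionParameters(dx=dx, dy=dy, angle=roll_deg)

print(to_motion_parameters(roll_deg=5.0, pitch_deg=-2.0, yaw_deg=1.0,
                           lateral=0.05, vertical=0.0))
```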
  • All these are intended to be encompassed by the following claims.

Claims (20)

1. A motion sensing device for providing visual display of motions to a user, comprising:
at least first and second sensors operatively configured to provide position information of at least first and second points, respectively, on the motion sensing device, the position information being sufficiently accurate to distinguish the first point from the second point, such that the provided position information of the first point with respect to the position information of the second point provides enough information to determine motions of the motion sensing device with respect to a visual axis of the user;
a display; and
an interface device coupled to said display and said at least first and second sensors, said interface device operating to transmit the motions of the motion sensing device to said display.
2. The motion sensing device of claim 1, wherein said at least first and second sensors are configured as a headset device.
3. The motion sensing device of claim 1, wherein said interface device includes at least one wireless transceiver.
4. The motion sensing device of claim 1, wherein said interface device includes a display driver.
5. The motion sensing device of claim 1, wherein said interface device includes a computer.
6. The motion sensing device of claim 1, wherein said display includes a computer monitor.
7. The motion sensing device of claim 1, wherein the motions of the motion sensing device include roll, pitch, and yaw motions.
8. The motion sensing device of claim 1, wherein the motions of the motion sensing device include vertical and horizontal translation motions.
9. A gaming device, comprising:
at least first and second sensors operatively configured to provide position information of at least first and second points, respectively, the position information being sufficiently accurate to distinguish the first point from the second point, such that the provided position information of the first point with respect to the position information of the second point provides enough information to determine motions of a user;
a display; and
an interface device configured to couple said at least first and second sensors to the user so that the motions of the user can be visually displayed on said display.
10. The gaming device of claim 9, wherein said interface device includes a headset to be worn around the user's head to sense the motions of the user's head.
11. The gaming device of claim 9, wherein said interface device and said display include a pair of glasses configured to sense the motions of the user's head and display the motions on said pair of glasses.
12. The gaming device of claim 9, wherein said interface device includes an attachment device.
13. The gaming device of claim 12, wherein said attachment device is a waist strap.
14. The gaming device of claim 12, wherein said attachment device includes a wrist strap.
15. The gaming device of claim 12, wherein said attachment device includes an ankle hook-and-loop device.
16. A gaming method, comprising:
providing at least first and second sensors to compute position information of at least first and second points, respectively, the position information being sufficiently accurate to distinguish the first point from the second point, such that the computed position information of the first point with respect to the position information of the second point provides enough information to determine motions of a user; and
coupling said at least first and second sensors to the user so that the motions of the user can be visually displayed.
17. The gaming method of claim 16, wherein said coupling includes attaching a headset, containing said first and second sensors, to be worn around the user's head to sense the motions of the user's head.
18. The gaming method of claim 16, wherein said coupling includes wearing a pair of glasses configured to sense the motions of the user's head such that the motions of the user can be visually displayed on said pair of glasses.
19. The gaming method of claim 16, wherein said coupling includes strapping said first and second sensors to the user's waist.
20. The gaming method of claim 16, wherein said coupling includes strapping said first and second sensors to the user's wrist.
US10/803,655 2003-09-12 2004-03-17 Motion sensing applications Abandoned US20050059489A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/803,655 US20050059489A1 (en) 2003-09-12 2004-03-17 Motion sensing applications

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/661,732 US20050073497A1 (en) 2003-09-12 2003-09-12 Remote control device capable of sensing motion
US10/803,655 US20050059489A1 (en) 2003-09-12 2004-03-17 Motion sensing applications

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/661,732 Continuation-In-Part US20050073497A1 (en) 2003-09-12 2003-09-12 Remote control device capable of sensing motion

Publications (1)

Publication Number Publication Date
US20050059489A1 true US20050059489A1 (en) 2005-03-17

Family

ID=46301908

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/803,655 Abandoned US20050059489A1 (en) 2003-09-12 2004-03-17 Motion sensing applications

Country Status (1)

Country Link
US (1) US20050059489A1 (en)

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060046847A1 (en) * 2004-09-02 2006-03-02 Yoshihisa Hashimoto Pose detection method, video game apparatus, pose detection program, and computer-readable medium containing computer program
US20070072662A1 (en) * 2005-09-28 2007-03-29 Templeman James N Remote vehicle control system
US20080188277A1 (en) * 2007-02-01 2008-08-07 Ritter Janice E Electronic Game Device And Method Of Using The Same
US7602301B1 (en) 2006-01-09 2009-10-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100184499A1 (en) * 2007-02-01 2010-07-22 Ritter Janice E Electronic Game Device and Method of Using the Same
US20100309224A1 (en) * 2004-03-31 2010-12-09 Canon Kabushiki Kaisha Image displaying method, image displaying program, and display
US20110212876A1 (en) * 2010-02-10 2011-09-01 Michelle Meek Cleaning composition comprising amylase variants with high stability in the presence of a chelating agent
US8498100B1 (en) 2012-03-02 2013-07-30 Microsoft Corporation Flexible hinge and removable attachment
US8654030B1 (en) 2012-10-16 2014-02-18 Microsoft Corporation Antenna placement
US8719603B2 (en) 2012-03-02 2014-05-06 Microsoft Corporation Accessory device authentication
US8733423B1 (en) 2012-10-17 2014-05-27 Microsoft Corporation Metal alloy injection molding protrusions
US8749529B2 (en) 2012-03-01 2014-06-10 Microsoft Corporation Sensor-in-pixel display system with near infrared filter
US8786767B2 (en) 2012-11-02 2014-07-22 Microsoft Corporation Rapid synchronized lighting and shuttering
US8873227B2 (en) 2012-03-02 2014-10-28 Microsoft Corporation Flexible hinge support layer
US8947353B2 (en) 2012-06-12 2015-02-03 Microsoft Corporation Photosensor array gesture detection
US8949477B2 (en) 2012-05-14 2015-02-03 Microsoft Technology Licensing, Llc Accessory device architecture
US8952892B2 (en) 2012-11-01 2015-02-10 Microsoft Corporation Input location correction tables for input panels
US8964379B2 (en) 2012-08-20 2015-02-24 Microsoft Corporation Switchable magnetic lock
US9019615B2 (en) 2012-06-12 2015-04-28 Microsoft Technology Licensing, Llc Wide field-of-view virtual image projector
US9027631B2 (en) 2012-10-17 2015-05-12 Microsoft Technology Licensing, Llc Metal alloy injection molding overflows
US9052414B2 (en) 2012-02-07 2015-06-09 Microsoft Technology Licensing, Llc Virtual image device
US9064654B2 (en) 2012-03-02 2015-06-23 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9073123B2 (en) 2012-06-13 2015-07-07 Microsoft Technology Licensing, Llc Housing vents
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technology Licensing, Llc Flexible hinge spine
US9152173B2 (en) 2012-10-09 2015-10-06 Microsoft Technology Licensing, Llc Transparent display device
US9176538B2 (en) 2013-02-05 2015-11-03 Microsoft Technology Licensing, Llc Input device configurations
US9201185B2 (en) 2011-02-04 2015-12-01 Microsoft Technology Licensing, Llc Directional backlighting for display panels
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US9256089B2 (en) 2012-06-15 2016-02-09 Microsoft Technology Licensing, Llc Object-detecting backlight unit
US9304549B2 (en) 2013-03-28 2016-04-05 Microsoft Technology Licensing, Llc Hinge mechanism for rotatable component attachment
US9317072B2 (en) 2014-01-28 2016-04-19 Microsoft Technology Licensing, Llc Hinge mechanism with preset positions
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9355345B2 (en) 2012-07-23 2016-05-31 Microsoft Technology Licensing, Llc Transparent tags with encoded data
US9360893B2 (en) 2012-03-02 2016-06-07 Microsoft Technology Licensing, Llc Input device writing surface
US9426905B2 (en) 2012-03-02 2016-08-23 Microsoft Technology Licensing, Llc Connection device for computing devices
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US9447620B2 (en) 2014-09-30 2016-09-20 Microsoft Technology Licensing, Llc Hinge mechanism with multiple preset positions
US9459160B2 (en) 2012-06-13 2016-10-04 Microsoft Technology Licensing, Llc Input device sensor configuration
US9513748B2 (en) 2012-12-13 2016-12-06 Microsoft Technology Licensing, Llc Combined display panel circuit
US9552777B2 (en) 2013-05-10 2017-01-24 Microsoft Technology Licensing, Llc Phase control backlight
US9638835B2 (en) 2013-03-05 2017-05-02 Microsoft Technology Licensing, Llc Asymmetric aberration correcting lens
US9661770B2 (en) 2012-10-17 2017-05-23 Microsoft Technology Licensing, Llc Graphic formation via material ablation
US9684382B2 (en) 2012-06-13 2017-06-20 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9752361B2 (en) 2015-06-18 2017-09-05 Microsoft Technology Licensing, Llc Multistage hinge
US9759854B2 (en) 2014-02-17 2017-09-12 Microsoft Technology Licensing, Llc Input device outer layer and backlighting
US9814982B2 (en) 2015-02-25 2017-11-14 Globalfoundries Inc. Mitigating collisions in a physical space during gaming
US9864415B2 (en) 2015-06-30 2018-01-09 Microsoft Technology Licensing, Llc Multistage friction hinge
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US10031556B2 (en) 2012-06-08 2018-07-24 Microsoft Technology Licensing, Llc User experience adaptation
US10037057B2 (en) 2016-09-22 2018-07-31 Microsoft Technology Licensing, Llc Friction hinge
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US10156889B2 (en) 2014-09-15 2018-12-18 Microsoft Technology Licensing, Llc Inductive peripheral retention device
US10222889B2 (en) 2015-06-03 2019-03-05 Microsoft Technology Licensing, Llc Force inputs and cursor control
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10344797B2 (en) 2016-04-05 2019-07-09 Microsoft Technology Licensing, Llc Hinge with multiple preset positions
US10416799B2 (en) 2015-06-03 2019-09-17 Microsoft Technology Licensing, Llc Force sensing and inadvertent input control of an input device
US10578499B2 (en) 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
USRE48963E1 (en) 2012-03-02 2022-03-08 Microsoft Technology Licensing, Llc Connection device for computing devices
US11944428B2 (en) 2015-11-30 2024-04-02 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4340378A (en) * 1980-09-08 1982-07-20 The Gates Rubber Company V-Block component and belt
US4630817A (en) * 1983-12-20 1986-12-23 Gym Bee Enterprises, Inc. Recreation apparatus
US4930888A (en) * 1987-11-07 1990-06-05 Messerschmitt-Boelkow-Blohm Gesellschaft Mit Beschraenkter Haftung Situation display system for attachment to a headgear
US6396462B1 (en) * 1996-04-05 2002-05-28 Fakespace Labs, Inc. Gimbal-mounted virtual reality display system
US6000942A (en) * 1996-09-17 1999-12-14 Systems Technology, Inc. Parachute flight training simulator
US6270414B2 (en) * 1997-12-31 2001-08-07 U.S. Philips Corporation Exoskeletal platform for controlling multi-directional avatar kinetics in a virtual environment
US6159100A (en) * 1998-04-23 2000-12-12 Smith; Michael D. Virtual reality game
US6471586B1 (en) * 1998-11-17 2002-10-29 Namco, Ltd. Game system and information storage medium
US6347999B1 (en) * 1999-11-18 2002-02-19 Jay C. Yuan Pinball simulator game system
US6716106B2 (en) * 2002-05-08 2004-04-06 Via Technologies, Inc. Real-scene tour simulation system and method of the same

Cited By (143)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100309224A1 (en) * 2004-03-31 2010-12-09 Canon Kabushiki Kaisha Image displaying method, image displaying program, and display
US9086790B2 (en) * 2004-03-31 2015-07-21 Canon Kabushiki Kaisha Image displaying method, image displaying program, and display
US20060046847A1 (en) * 2004-09-02 2006-03-02 Yoshihisa Hashimoto Pose detection method, video game apparatus, pose detection program, and computer-readable medium containing computer program
US7559841B2 (en) * 2004-09-02 2009-07-14 Sega Corporation Pose detection method, video game apparatus, pose detection program, and computer-readable medium containing computer program
US7731588B2 (en) * 2005-09-28 2010-06-08 The United States Of America As Represented By The Secretary Of The Navy Remote vehicle control system
US20070072662A1 (en) * 2005-09-28 2007-03-29 Templeman James N Remote vehicle control system
US7978081B2 (en) 2006-01-09 2011-07-12 Applied Technology Holdings, Inc. Apparatus, systems, and methods for communicating biometric and biomechanical information
US9907997B2 (en) 2006-01-09 2018-03-06 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11819324B2 (en) 2006-01-09 2023-11-21 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11717185B2 (en) 2006-01-09 2023-08-08 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100201512A1 (en) * 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for evaluating body movements
US20100204616A1 (en) * 2006-01-09 2010-08-12 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100201500A1 (en) * 2006-01-09 2010-08-12 Harold Dan Stirling Apparatus, systems, and methods for communicating biometric and biomechanical information
US7821407B2 (en) 2006-01-09 2010-10-26 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7825815B2 (en) 2006-01-09 2010-11-02 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100121227A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7602301B1 (en) 2006-01-09 2009-10-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100117837A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100121228A1 (en) * 2006-01-09 2010-05-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US10675507B2 (en) 2006-01-09 2020-06-09 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11399758B2 (en) 2006-01-09 2022-08-02 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11653856B2 (en) 2006-01-09 2023-05-23 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US11452914B2 (en) 2006-01-09 2022-09-27 Nike, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20100184499A1 (en) * 2007-02-01 2010-07-22 Ritter Janice E Electronic Game Device and Method of Using the Same
US8651953B2 (en) 2007-02-01 2014-02-18 Mattel, Inc. Electronic game device and method of using the same
US20080188277A1 (en) * 2007-02-01 2008-08-07 Ritter Janice E Electronic Game Device And Method Of Using The Same
US20100311485A1 (en) * 2007-02-01 2010-12-09 Mattel, Inc. Electronic Game Device and Method of Using the Same
US20110212876A1 (en) * 2010-02-10 2011-09-01 Michelle Meek Cleaning composition comprising amylase variants with high stability in the presence of a chelating agent
US9201185B2 (en) 2011-02-04 2015-12-01 Microsoft Technology Licensing, Llc Directional backlighting for display panels
US9052414B2 (en) 2012-02-07 2015-06-09 Microsoft Technology Licensing, Llc Virtual image device
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US8749529B2 (en) 2012-03-01 2014-06-10 Microsoft Corporation Sensor-in-pixel display system with near infrared filter
US9176900B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US8614666B2 (en) 2012-03-02 2013-12-24 Microsoft Corporation Sensing user input at display area edge
US8780540B2 (en) 2012-03-02 2014-07-15 Microsoft Corporation Flexible hinge and removable attachment
US8498100B1 (en) 2012-03-02 2013-07-30 Microsoft Corporation Flexible hinge and removable attachment
US8791382B2 (en) 2012-03-02 2014-07-29 Microsoft Corporation Input device securing techniques
US8830668B2 (en) 2012-03-02 2014-09-09 Microsoft Corporation Flexible hinge and removable attachment
US8850241B2 (en) 2012-03-02 2014-09-30 Microsoft Corporation Multi-stage power adapter configured to provide low power upon initial connection of the power adapter to the host device and high power thereafter upon notification from the host device to the power adapter
US8854799B2 (en) 2012-03-02 2014-10-07 Microsoft Corporation Flux fountain
US8873227B2 (en) 2012-03-02 2014-10-28 Microsoft Corporation Flexible hinge support layer
US8896993B2 (en) 2012-03-02 2014-11-25 Microsoft Corporation Input device layers and nesting
US8903517B2 (en) 2012-03-02 2014-12-02 Microsoft Corporation Computer device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices
US8935774B2 (en) 2012-03-02 2015-01-13 Microsoft Corporation Accessory device authentication
US8947864B2 (en) 2012-03-02 2015-02-03 Microsoft Corporation Flexible hinge and removable attachment
US8543227B1 (en) 2012-03-02 2013-09-24 Microsoft Corporation Sensor fusion algorithm
US8548608B2 (en) 2012-03-02 2013-10-01 Microsoft Corporation Sensor fusion algorithm
US8564944B2 (en) 2012-03-02 2013-10-22 Microsoft Corporation Flux fountain
US8570725B2 (en) 2012-03-02 2013-10-29 Microsoft Corporation Flexible hinge and removable attachment
USRE48963E1 (en) 2012-03-02 2022-03-08 Microsoft Technology Licensing, Llc Connection device for computing devices
US10963087B2 (en) 2012-03-02 2021-03-30 Microsoft Technology Licensing, Llc Pressure sensitive keys
US8610015B2 (en) 2012-03-02 2013-12-17 Microsoft Corporation Input device securing techniques
US9047207B2 (en) 2012-03-02 2015-06-02 Microsoft Technology Licensing, Llc Mobile device power state
US8724302B2 (en) 2012-03-02 2014-05-13 Microsoft Corporation Flexible hinge support layer
US9064654B2 (en) 2012-03-02 2015-06-23 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US10013030B2 (en) 2012-03-02 2018-07-03 Microsoft Technology Licensing, Llc Multiple position input device cover
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technology Licensing, Llc Flexible hinge spine
US8719603B2 (en) 2012-03-02 2014-05-06 Microsoft Corporation Accessory device authentication
US9098117B2 (en) 2012-03-02 2015-08-04 Microsoft Technology Licensing, Llc Classifying the intent of user input
US9946307B2 (en) 2012-03-02 2018-04-17 Microsoft Technology Licensing, Llc Classifying the intent of user input
US9111703B2 (en) 2012-03-02 2015-08-18 Microsoft Technology Licensing, Llc Sensor stack venting
US9116550B2 (en) 2012-03-02 2015-08-25 Microsoft Technology Licensing, Llc Device kickstand
US9134808B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Device kickstand
US9134807B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9146620B2 (en) 2012-03-02 2015-09-29 Microsoft Technology Licensing, Llc Input device assembly
US9904327B2 (en) 2012-03-02 2018-02-27 Microsoft Technology Licensing, Llc Flexible hinge and removable attachment
US9158384B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Flexible hinge protrusion attachment
US9158383B2 (en) 2012-03-02 2015-10-13 Microsoft Technology Licensing, Llc Force concentrator
US9176901B2 (en) 2012-03-02 2015-11-03 Microsoft Technology Licensing, Llc Flux fountain
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US8699215B2 (en) 2012-03-02 2014-04-15 Microsoft Corporation Flexible hinge spine
US9852855B2 (en) 2012-03-02 2017-12-26 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9793073B2 (en) 2012-03-02 2017-10-17 Microsoft Technology Licensing, Llc Backlighting a fabric enclosure of a flexible cover
US9766663B2 (en) 2012-03-02 2017-09-19 Microsoft Technology Licensing, Llc Hinge for component attachment
US9268373B2 (en) 2012-03-02 2016-02-23 Microsoft Technology Licensing, Llc Flexible hinge spine
US9275809B2 (en) 2012-03-02 2016-03-01 Microsoft Technology Licensing, Llc Device camera angle
US9298236B2 (en) 2012-03-02 2016-03-29 Microsoft Technology Licensing, Llc Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter
US9304949B2 (en) 2012-03-02 2016-04-05 Microsoft Technology Licensing, Llc Sensing user input at display area edge
US9710093B2 (en) 2012-03-02 2017-07-18 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9304948B2 (en) 2012-03-02 2016-04-05 Microsoft Technology Licensing, Llc Sensing user input at display area edge
US9678542B2 (en) 2012-03-02 2017-06-13 Microsoft Technology Licensing, Llc Multiple position input device cover
US9618977B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Input device securing techniques
US8646999B2 (en) 2012-03-02 2014-02-11 Microsoft Corporation Pressure sensitive key normalization
US9619071B2 (en) 2012-03-02 2017-04-11 Microsoft Technology Licensing, Llc Computing device and an apparatus having sensors configured for measuring spatial information indicative of a position of the computing devices
US9360893B2 (en) 2012-03-02 2016-06-07 Microsoft Technology Licensing, Llc Input device writing surface
US9411751B2 (en) 2012-03-02 2016-08-09 Microsoft Technology Licensing, Llc Key formation
US9426905B2 (en) 2012-03-02 2016-08-23 Microsoft Technology Licensing, Llc Connection device for computing devices
US9465412B2 (en) 2012-03-02 2016-10-11 Microsoft Technology Licensing, Llc Input device layers and nesting
US9460029B2 (en) 2012-03-02 2016-10-04 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9348605B2 (en) 2012-05-14 2016-05-24 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes human interface device (HID) data via intermediate processor
US9098304B2 (en) 2012-05-14 2015-08-04 Microsoft Technology Licensing, Llc Device enumeration support method for computing devices that does not natively support device enumeration
US9959241B2 (en) 2012-05-14 2018-05-01 Microsoft Technology Licensing, Llc System and method for accessory device architecture that passes via intermediate processor a descriptor when processing in a low power state
US8949477B2 (en) 2012-05-14 2015-02-03 Microsoft Technology Licensing, Llc Accessory device architecture
US10031556B2 (en) 2012-06-08 2018-07-24 Microsoft Technology Licensing, Llc User experience adaptation
US10107994B2 (en) 2012-06-12 2018-10-23 Microsoft Technology Licensing, Llc Wide field-of-view virtual image projector
US9019615B2 (en) 2012-06-12 2015-04-28 Microsoft Technology Licensing, Llc Wide field-of-view virtual image projector
US8947353B2 (en) 2012-06-12 2015-02-03 Microsoft Corporation Photosensor array gesture detection
US10228770B2 (en) 2012-06-13 2019-03-12 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9073123B2 (en) 2012-06-13 2015-07-07 Microsoft Technology Licensing, Llc Housing vents
US9684382B2 (en) 2012-06-13 2017-06-20 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9952106B2 (en) 2012-06-13 2018-04-24 Microsoft Technology Licensing, Llc Input device sensor configuration
US9459160B2 (en) 2012-06-13 2016-10-04 Microsoft Technology Licensing, Llc Input device sensor configuration
US9256089B2 (en) 2012-06-15 2016-02-09 Microsoft Technology Licensing, Llc Object-detecting backlight unit
US9355345B2 (en) 2012-07-23 2016-05-31 Microsoft Technology Licensing, Llc Transparent tags with encoded data
US9753540B2 (en) 2012-08-02 2017-09-05 Immersion Corporation Systems and methods for haptic remote control gaming
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US8964379B2 (en) 2012-08-20 2015-02-24 Microsoft Corporation Switchable magnetic lock
US9824808B2 (en) 2012-08-20 2017-11-21 Microsoft Technology Licensing, Llc Switchable magnetic lock
US9152173B2 (en) 2012-10-09 2015-10-06 Microsoft Technology Licensing, Llc Transparent display device
US9432070B2 (en) 2012-10-16 2016-08-30 Microsoft Technology Licensing, Llc Antenna placement
US8654030B1 (en) 2012-10-16 2014-02-18 Microsoft Corporation Antenna placement
US8733423B1 (en) 2012-10-17 2014-05-27 Microsoft Corporation Metal alloy injection molding protrusions
US8991473B2 (en) 2012-10-17 2015-03-31 Microsoft Technology Licensing, Llc Metal alloy injection molding protrusions
US9027631B2 (en) 2012-10-17 2015-05-12 Microsoft Technology Licensing, Llc Metal alloy injection molding overflows
US9661770B2 (en) 2012-10-17 2017-05-23 Microsoft Technology Licensing, Llc Graphic formation via material ablation
US8952892B2 (en) 2012-11-01 2015-02-10 Microsoft Corporation Input location correction tables for input panels
US9544504B2 (en) 2012-11-02 2017-01-10 Microsoft Technology Licensing, Llc Rapid synchronized lighting and shuttering
US8786767B2 (en) 2012-11-02 2014-07-22 Microsoft Corporation Rapid synchronized lighting and shuttering
US9513748B2 (en) 2012-12-13 2016-12-06 Microsoft Technology Licensing, Llc Combined display panel circuit
US9176538B2 (en) 2013-02-05 2015-11-03 Microsoft Technology Licensing, Llc Input device configurations
US10578499B2 (en) 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
US9638835B2 (en) 2013-03-05 2017-05-02 Microsoft Technology Licensing, Llc Asymmetric aberration correcting lens
US9304549B2 (en) 2013-03-28 2016-04-05 Microsoft Technology Licensing, Llc Hinge mechanism for rotatable component attachment
US9552777B2 (en) 2013-05-10 2017-01-24 Microsoft Technology Licensing, Llc Phase control backlight
US10359848B2 (en) 2013-12-31 2019-07-23 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US9317072B2 (en) 2014-01-28 2016-04-19 Microsoft Technology Licensing, Llc Hinge mechanism with preset positions
US9759854B2 (en) 2014-02-17 2017-09-12 Microsoft Technology Licensing, Llc Input device outer layer and backlighting
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US10156889B2 (en) 2014-09-15 2018-12-18 Microsoft Technology Licensing, Llc Inductive peripheral retention device
US9447620B2 (en) 2014-09-30 2016-09-20 Microsoft Technology Licensing, Llc Hinge mechanism with multiple preset positions
US9964998B2 (en) 2014-09-30 2018-05-08 Microsoft Technology Licensing, Llc Hinge mechanism with multiple preset positions
US9814982B2 (en) 2015-02-25 2017-11-14 Globalfoundries Inc. Mitigating collisions in a physical space during gaming
US10222889B2 (en) 2015-06-03 2019-03-05 Microsoft Technology Licensing, Llc Force inputs and cursor control
US10416799B2 (en) 2015-06-03 2019-09-17 Microsoft Technology Licensing, Llc Force sensing and inadvertent input control of an input device
US9752361B2 (en) 2015-06-18 2017-09-05 Microsoft Technology Licensing, Llc Multistage hinge
US10606322B2 (en) 2015-06-30 2020-03-31 Microsoft Technology Licensing, Llc Multistage friction hinge
US9864415B2 (en) 2015-06-30 2018-01-09 Microsoft Technology Licensing, Llc Multistage friction hinge
US11944428B2 (en) 2015-11-30 2024-04-02 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US10344797B2 (en) 2016-04-05 2019-07-09 Microsoft Technology Licensing, Llc Hinge with multiple preset positions
US10037057B2 (en) 2016-09-22 2018-07-31 Microsoft Technology Licensing, Llc Friction hinge

Similar Documents

Publication Publication Date Title
US20050059489A1 (en) Motion sensing applications
US10001833B2 (en) User input system for immersive interaction
US9317108B2 (en) Hand-held wireless electronic device with accelerometer for interacting with a display
CA2747814C (en) Hands-free pointer system
KR101958778B1 (en) A Head Mounted Display and a Method for Controlling a Digital Device Using the Same
US6757068B2 (en) Self-referenced tracking
EP3673345B1 (en) System and method for distributed device tracking
Foxlin et al. WearTrack: A self-referenced head and hand tracker for wearable computers and portable VR
CN108292490B (en) Display control device and display control method
CN107646098A (en) System for tracking portable equipment in virtual reality
US9013396B2 (en) System and method for controlling a virtual reality environment by an actor in the virtual reality environment
US20160171780A1 (en) Computer device in form of wearable glasses and user interface thereof
US20130002559A1 (en) Desktop computer user interface
JP2023129717A (en) Head-mounted information processing apparatus and control method thereof
KR101665027B1 (en) Head tracking bar system for head mount display
KR101530340B1 (en) Motion sensing system for implementing hand position-posture information of user in a three-dimensional virtual space based on a combined motion tracker and ahrs system
US20230256297A1 (en) Virtual evaluation tools for augmented reality exercise experiences
US20220219075A1 (en) Calibration system and method for handheld controller
WO2022113834A1 (en) System, imaging device, information processing device, information processing method, and information processing program
JP7462591B2 (en) Display control device and display control method
US20240112421A1 (en) System and method of object tracking for extended reality environment
EP4206867A1 (en) Peripheral tracking system and method
US11960635B2 (en) Virtual object display device and virtual object display method
JP2023027077A (en) Display control device and display control method
CN117784926A (en) Control device, control method, and computer-readable storage medium

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION