WO2006098770A2 - Apparatus and method for performing motion capture using shutter synchronization - Google Patents
- Publication number
- WO2006098770A2 (PCT/US2005/034524)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cameras
- light source
- performer
- shutters
- phosphorescent paint
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V8/00—Prospecting or detecting by optical means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/75—Circuitry for compensating brightness variation in the scene by influencing optical camera components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Definitions
- This invention relates generally to the field of motion capture. More particularly, the invention relates to an improved apparatus and method for performing motion capture using shutter synchronization and/or using phosphorescent paint.
- Motion capture refers generally to the tracking and recording of human and animal motion. Motion capture systems are used for a variety of applications including, for example, video games and computer-generated movies. In a typical motion capture session, the motion of a "performer" is captured and translated to a computer-generated character.
- A plurality of motion tracking "markers" are attached at various points on a performer 100's body.
- The points are selected based on the known limitations of the human skeleton.
- Different types of motion capture markers are used for different motion capture systems.
- The motion markers attached to the performer are active coils which generate measurable disruptions (x, y, z and yaw, pitch, roll) in a magnetic field.
- The markers 101, 102 are passive spheres comprised of retro-reflective material, i.e., a material which reflects light back in the direction from which it came, ideally over a wide range of angles of incidence.
- A plurality of cameras 120, 121, 122, each with a ring of LEDs 130, 131, 132 around its lens, are positioned to capture the LED light reflected back from the retro-reflective markers 101, 102 and other markers on the performer.
- The retro-reflected LED light is much brighter than any other light source in the room.
- A thresholding function is applied by the cameras 120, 121, 122 to reject all light below a specified level of brightness. Ideally, this isolates the light reflected off of the reflective markers from any other light in the room, so the cameras 120, 121, 122 capture only the light from the markers 101, 102 and other markers on the performer.
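The thresholding step can be sketched as a simple per-pixel brightness cutoff. The pixel values and cutoff below are illustrative, not taken from the patent:

```python
def threshold_frame(frame, cutoff):
    """Reject all light below the brightness cutoff: pixels dimmer than
    the cutoff are zeroed, leaving only the bright retro-reflected
    marker light."""
    return [[px if px >= cutoff else 0 for px in row] for row in frame]

# A toy 3x4 grayscale frame: markers retro-reflect at ~250 while the
# rest of the room stays dim (hypothetical 8-bit values).
frame = [
    [12, 250, 30, 8],
    [5, 18, 247, 22],
    [9, 11, 16, 253],
]
isolated = threshold_frame(frame, cutoff=200)
```

After thresholding, only the marker pixels survive, which is what lets the motion tracking unit treat each bright blob as a marker observation.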
- A motion tracking unit 150 coupled to the cameras is programmed with the relative position of each of the markers 101, 102 and/or the known limitations of the performer's body. Using this information and the visual data provided from the cameras 120-122, the motion tracking unit 150 generates artificial motion data representing the movement of the performer during the motion capture session.
- A graphics processing unit 152 renders an animated representation of the performer on a computer display 160 (or similar display device) using the motion data.
- The graphics processing unit 152 may apply the captured motion of the performer to different animated characters and/or include the animated characters in different computer-generated scenes.
- The motion tracking unit 150 and the graphics processing unit 152 are programmable cards coupled to the bus of a computer (e.g., the PCI and AGP buses found in many personal computers).
- One well known company which produces motion capture systems is Motion Analysis Corporation (see, e.g., www.motionanalysis.com).
- A method comprising: applying phosphorescent paint to specified regions of a performer's face and/or body; strobing a light source on and off, the light source charging the phosphorescent paint when on; and strobing the shutters of a first plurality of cameras synchronously with the strobing of the light source to capture images of the phosphorescent paint, wherein the shutters are open when the light source is off and the shutters are closed when the light source is on.
- FIG. 1 illustrates a prior art motion tracking system for tracking the motion of a performer using retro-reflective markers and cameras.
- FIG. 2 illustrates one embodiment of the invention which employs a curve pattern to track facial expression.
- FIG. 3 illustrates one embodiment of the invention which synchronizes light panels and camera shutters.
- FIG. 4 is a timing diagram illustrating the synchronization between the light panels and the shutters according to one embodiment of the invention.
- FIG. 5 is a schematic representation of an exemplary LED array and the connectors for the synchronization signals.
- FIG. 6a illustrates a set of exemplary illuminated curves painted on a performer's face during a lit frame.
- FIG. 6b illustrates a set of exemplary illuminated curves painted on a performer's face during a "glow" frame.
- FIG. 7 is a timing diagram illustrating the synchronization between the light panels and the camera shutters in an embodiment for capturing both lit frames and glow frames.
- FIG. 8 is a timing diagram illustrating the synchronization between the light panels and the camera shutters in another embodiment for capturing both lit frames and glow frames.
- FIG. 9 illustrates one embodiment of a system for capturing both lit frames and glow frames.
- FIG. 10 illustrates a timing diagram associated with the system shown in FIG. 9.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
- Figure 2 illustrates an exemplary motion capture system described in the co-pending applications in which a predefined facial curve pattern 201 is adjusted to fit the topology of each performer's face 202.
- The three-dimensional (3-D) curve pattern is adjusted based on a 3-D map of the topology of the performer's face captured using a 3-D scanning system.
- The curves defined by the curve pattern 201 are painted on the face of the performer using retro-reflective, non-toxic paint or theatrical makeup. As described in detail below, in one embodiment of the invention, non-toxic phosphorescent paint is used to create the curves.
- Each curve painted on the performer's face has a unique identifying name and/or number (to support systematic data processing) and potentially a color that can be easily identified by the optical capture system.
- The curve pattern is tracked by a motion capture processing system 210 comprised of one or more camera controllers 205 and a central motion capture controller 206 during the course of a performance.
- Each of the camera controllers 205 and central motion capture controller 206 is implemented using a separate computer system.
- The camera controllers and motion capture controller may alternatively be implemented as software executed on a single computer system or as any combination of hardware and software.
- Each of the camera controllers 205 and/or the motion capture controller 206 is programmed with data 203 representing the curve pattern 201.
- The motion capture system 210 uses this information to trace the movement of each curve within the curve pattern during a performance. For example, the performer's facial expressions provided by each of the cameras 204 (e.g., as bitmap images) are analyzed and the curves identified using the defined curve pattern.
- The curve data 203 is provided to the motion capture system in the form of a "connectivity map," which is a text-file representation of the curve pattern 201 that includes a list of all curves in the pattern and a list of all surface patches in the pattern, with each patch defined by its bounding curves. It is used by the camera controllers 205 and/or the central motion capture controller 206 to identify curves and intersections in the optically captured data. This, in turn, allows point data from the curves to be organized into surface patches and ultimately into the triangulated mesh of a final 3-D geometry 207.
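A connectivity map of this kind can be modeled as a small text format: one section listing curve names, another listing each patch with its bounding curves. The syntax below (section headers, `patch: curve, curve` lines) and the curve names are invented for illustration; the patent does not specify the actual file layout:

```python
def parse_connectivity_map(text):
    """Parse a minimal text connectivity map into a list of curves and a
    dict mapping each surface patch to its bounding curves. The file
    syntax here is a guess for illustration only."""
    curves, patches, section = [], {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line in ("curves:", "patches:"):
            section = line[:-1]          # enter the named section
        elif section == "curves":
            curves.append(line)          # one curve name per line
        elif section == "patches":
            name, bounds = line.split(":")
            patches[name.strip()] = [c.strip() for c in bounds.split(",")]
    return curves, patches

sample = """\
curves:
brow_L
brow_R
nose_bridge
patches:
P1: brow_L, nose_bridge
P2: brow_R, nose_bridge
"""
curves, patches = parse_connectivity_map(sample)
```

With the patches keyed by their bounding curves, point data recovered from each captured curve can be grouped per patch before triangulating the final mesh.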
- The efficiency of the motion capture system is improved by using phosphorescent paint and/or by precisely controlling synchronization between the cameras' shutters and the illumination of the painted curves. More specifically, referring to Figure 3, in one embodiment of the invention, the predefined facial curve pattern 301 is painted on the performer's face 202 using phosphorescent paint.
- The synchronization between the light panels 308-309 and cameras 304 is controlled via synchronization signals 322 and 321, respectively.
- The synchronization signals are provided from a peripheral component interface ("PCI") card 323 coupled to the PCI bus of a personal computer 320.
- An exemplary PCI card is a PCI-6601 manufactured by National Instruments of Austin, Texas.
- The underlying principles of the invention are not limited to any particular mechanism for generating the synchronization signals.
- The synchronization between the light sources and the cameras employed in one embodiment of the invention is illustrated in Figure 4.
- The two synchronization signals 321, 322 are the same.
- The synchronization signals cycle between 0 and 5 volts.
- The shutters of the cameras are periodically opened and closed, and the light panels are periodically turned off and on, respectively.
- On the rising edge of the synchronization signals, the shutters are closed and the light panels are illuminated.
- The shutters remain closed and the light panels remain illuminated for a period of time 413.
- On the falling edge of the synchronization signals, the shutters are opened and the light panels are turned off.
- The shutters and light panels are left in this state for another period of time 415.
- The process then repeats on the rising edge 417 of the synchronization signals.
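The alternating states driven by the shared synchronization signal can be sketched as a simple timeline. This is a toy model of the Figure 4 behavior, not production capture code:

```python
def strobe_timeline(cycles):
    """One entry per half-cycle of the synchronization signal: the rising
    edge closes the shutters and lights the panels (charging the paint);
    the falling edge darkens the panels and opens the shutters."""
    timeline = []
    for _ in range(cycles):
        timeline.append({"edge": "rising", "light_on": True, "shutter_open": False})
        timeline.append({"edge": "falling", "light_on": False, "shutter_open": True})
    return timeline

timeline = strobe_timeline(2)
```

At every instant the shutters are open exactly when the panels are dark, which is the invariant the claimed method relies on for capturing only the glowing paint.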
- During the first period of time 413, the phosphorescent paint is illuminated with light from the light panels 308-309.
- During the second period of time 415, the light is turned off and the cameras capture an image of the glowing phosphorescent paint on the performer. Because the light panels are off during the second period of time 415, the contrast between the phosphorescent paint and the rest of the room is extremely high (i.e., the rest of the room is pitch black), thereby improving the ability of the system to differentiate the various curves painted on the performer's face. In addition, because the light panels are on half of the time, the performer will be able to see around the room during the performance.
- The frequency 416 of the synchronization signals may be set at such a high rate that the performer will not even notice that the light panels are being turned on and off. For example, at a flashing rate of 75 Hz or above, most humans are unable to perceive that a light is flashing and the light appears to be continuously illuminated. In psychophysical parlance, when a high-frequency flashing light is perceived by humans to be continuously illuminated, it is said that "fusion" has been achieved.
- In one embodiment, the light panels are cycled at 120 Hz; in another embodiment, the light panels are cycled at 140 Hz, both frequencies far above the fusion threshold of any human.
- The underlying principles of the invention are not limited to any particular frequency.
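For context on why the flashing is imperceptible, the length of each lit interval at a given strobe frequency is easy to compute. This is a quick illustrative calculation, not a figure from the patent:

```python
def lit_interval_ms(freq_hz, duty_cycle=0.5):
    """Duration of each lit interval, in milliseconds, for a strobe of the
    given frequency and duty cycle."""
    return duty_cycle * 1000.0 / freq_hz

# At the 120 Hz cycling rate mentioned above with a 50% duty cycle,
# the panels are lit for only about 4.2 ms at a time.
interval = lit_interval_ms(120)
```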
- Figure 6a is an exemplary picture of the performer during the first time period 413 (i.e., when the light panels are illuminated) and Figure 6b shows the illuminated reflective curves captured by the cameras 304 during the second time period 415 (i.e., when the light panels are turned off).
- The phosphorescent paint is charged by the light from the light panels and, as illustrated in Figure 6b, when the light panels are turned off, the only light captured by the cameras is the light emanating from the charged phosphorescent paint.
- The phosphorescent paint is constantly recharged by the strobing of the light panels, and therefore retains its glow throughout the motion capture session.
- The light panels 308, 309 are LED arrays.
- A schematic of an exemplary LED array 501 and associated connection circuitry is illustrated in Figure 5.
- The synchronization signals are applied to the LED array 501 via connector J2-1, illustrated to the left in Figure 5.
- The connectors are RJ-45 connectors.
- The synchronization signal is initially inverted by inverter IC2B, and the inverted signal is applied to the base of transistor Q2, causing transistor Q2 to turn on and off in response to the inverted signal. This causes current to flow through resistor R3, thereby causing transistor Q1 to turn on and off. This, in turn, causes the LEDs within the LED array 501 to turn on and off.
- The inverted signal from IC2B is applied to three additional LED arrays as indicated in Figure 5.
- A plurality of additional connectors J1-1, J1-2, J1-3, and J1-4 are provided for additional light panels (i.e., the light panels may be daisy-chained together via these connectors) using inverters IC2C, IC2D, IC2E and IC2F for buffering. If daisy-chaining without buffering is desired (e.g., due to critical timing requirements that would be hampered by the IC2 propagation delays), then connector J2-2 can be used.
- The voltage regulator IC1 used for the LED array (shown at the top of Figure 5) takes a 12 V input and produces a 5 V regulated output used by IC2.
- Transistor Q1 is a MOSFET.
- The underlying principles are not limited to any particular type of circuitry.
- The cameras are configured to capture pictures of the performer's face (e.g., Figure 6a) in addition to capturing the phosphorescent curves (e.g., Figure 6b).
- The pictures of the performer's face may then be used, for example, by animators as a texture map to interpolate between the curves and render a more accurate representation of the performer.
- The signal timing illustrated in Figure 7 represents one such embodiment, in which an asymmetric duty cycle is used for the synchronization signal for the cameras (in contrast to the 50% duty cycle shown in Figure 4).
- Synchronization signal 2 remains the same as in Figure 4.
- The rising edge 722 of synchronization signal 2 illuminates the light panels; the panels remain on for a first time period 723, turn off in response to the falling edge 724 of synchronization signal 2, and remain off for a second time period 725.
- Synchronization signal 1, which is used to control the shutters, has an asymmetric duty cycle.
- In response to the rising edge 712 of synchronization signal 1, the shutters are closed. The shutters remain closed for a first period of time 713 and are then opened in response to the falling edge 714 of synchronization signal 1. The shutters remain open for a second period of time 715 and are again closed in response to the rising edge of synchronization signal 1.
- The signals are synchronized so that the rising edge of synchronization signal 1 always coincides with both the rising and the falling edges of synchronization signal 2.
- The cameras capture one lit frame during time period 715 (i.e., when the shutters are open and the light panels are illuminated) and capture one "glow frame" during time period 716 (i.e., when the shutters are open and the light panels are off).
- The data processing system 310 shown in Figure 3 separates the lit frames from the glow frames to generate two separate streams of image data, one containing the images of the performer's face and the other containing phosphorescent curve data.
- The glow frames may then be used to generate the mesh 307 of the performer's face, and the lit frames may be used, for example, as a reference for animators (e.g., to interpolate between the curves) and/or as a texture map of the performer's face.
- The two separate video sequences may be synchronized and viewed next to one another on a computer or other type of image editing device.
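Separating the interleaved capture into the two streams reduces to de-interleaving the frame sequence, assuming frames strictly alternate lit, glow, lit, glow as in Figure 7. The frame labels below are placeholders:

```python
def split_streams(frames):
    """Demultiplex an interleaved capture into a lit-frame stream (face
    texture / animator reference) and a glow-frame stream (phosphorescent
    curve data), assuming strict lit/glow alternation."""
    return frames[0::2], frames[1::2]

lit, glow = split_streams(["lit0", "glow0", "lit1", "glow1"])
```

The glow stream feeds mesh reconstruction while the lit stream stays frame-aligned with it, which is what allows the two sequences to be viewed side by side in sync.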
- The sensitivity of the cameras is cycled between lit frames and glow frames. That is, the sensitivity is set to a relatively high level for the glow frames and is then changed to a relatively low level for the lit frames.
- One embodiment of the invention changes the amount of time that the shutters are open between the lit frames and the glow frames.
- Figure 8 illustrates the timing of one such embodiment in which synchronization signal 1 is adjusted to ensure that the cameras will not be overdriven by the lit frames.
- When synchronization signal 2 is illuminating the light panels, synchronization signal 1 causes the shutter to be closed for a relatively longer period of time than when synchronization signal 2 is not illuminating the light panels.
- While the light panels are illuminated, synchronization signal 1 is high during time period 853, thereby closing the shutter, and is low during period 855, thereby opening the shutter.
- While the light panels are off, synchronization signal 1 is high for a relatively short period of time 813 and is low for a relatively longer period of time 815.
- Both color and grayscale cameras are used and are synchronized using different synchronization signals.
- Color cameras 914-915 are used to capture the lit frames, and grayscale cameras 904-905 are used to capture the phosphorescent curves painted on the performer's face.
- Grayscale cameras typically have relatively higher resolution and higher light sensitivity than color cameras of comparable sensor resolution, and can therefore capture the phosphorescent curves more precisely.
- Color cameras are better suited to capturing the color and texture of the performer's face.
- The grayscale cameras may be adjusted to a relatively higher sensitivity than the color cameras.
- Synchronization signals 1A and 1B are used to control the grayscale and color cameras, respectively.
- Synchronization signals 1A and 1B are 180 degrees out of phase.
- The falling edge 1014 of synchronization signal 1B occurs at the same time as the rising edge 1024 of synchronization signal 1A, thereby opening the shutters for the color cameras 914, 915 and closing the shutters for the grayscale cameras 904, 905.
- The rising edge 1012 of synchronization signal 1B occurs at the same time as the falling edge 1022 of synchronization signal 1A, thereby closing the shutters for the color cameras 914, 915 and opening the shutters for the grayscale cameras 904, 905.
- The synchronization signal 2 for the light panels is not illustrated in Figure 10 but, in one embodiment, is the same as it is in Figure 4, turning the light panels on when the color camera shutters are opened and turning the light panels off when the grayscale camera shutters are opened.
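The complementary shuttering of the two camera groups can be sketched as follows. This is a toy model of the 180-degree phase relationship; the polarity conventions and starting state are assumptions, not read from Figure 10:

```python
def camera_states(n_half_cycles):
    """Alternate which camera group's shutters are open: with signals 1A
    and 1B 180 degrees out of phase, exactly one group is exposing at any
    moment -- color cameras while the panels are lit, grayscale cameras
    while only the paint glows."""
    states = []
    color_open = True  # assume the sequence starts with the panels lit
    for _ in range(n_half_cycles):
        states.append({"color_open": color_open,
                       "grayscale_open": not color_open})
        color_open = not color_open
    return states

states = camera_states(4)
```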
- The synchronization signals may require slight delays between respective edges to accommodate delays in the cameras and LED arrays. For example, on some video cameras, there is a slight delay after rising edge 412 of Figure 4 before the camera shutter closes. This can be easily accommodated by delaying signal 322 relative to signal 321. Such delays are typically on the order of less than a millisecond. As such, when the system is started, the timing signals may initially need to be precisely calibrated by observing whether the video cameras 304 are capturing completely black frames and adjusting the timing signals 321 and 322 prior to the actual performance.
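The startup calibration described above can be sketched as a delay sweep that stops once the glow frames come back black. The helper `peak_background`, the microsecond step, and the thresholds are hypothetical:

```python
def calibrate_delay(peak_background, black_level, step_us=50, max_delay_us=1000):
    """Sweep the delay of the light-panel signal relative to the shutter
    signal until a captured glow frame's background brightness falls to
    the black level, i.e. no panel light leaks into the open-shutter
    window. `peak_background(delay_us)` is a hypothetical hook that
    captures a glow frame at that delay and returns its brightest
    non-paint pixel."""
    delay = 0
    while delay <= max_delay_us:
        if peak_background(delay) <= black_level:
            return delay
        delay += step_us
    raise RuntimeError("no delay in range produced black glow frames")

# Simulated camera: panel light stops leaking once the delay reaches 300 us.
found = calibrate_delay(lambda d: 0 if d >= 300 else 200, black_level=10)
```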
- The underlying principles of the invention are not limited to this implementation using facial curves.
- One embodiment of the invention uses markers dipped in phosphorescent paint to capture the skeletal motion of the performer using the shutter and light panel synchronization techniques described above (either in lieu of or in addition to the curves on the performer's face, and either in lieu of or in addition to retroreflective markers).
- Curves may also be painted on the body and/or clothing of the performer while still complying with the underlying principles of the invention.
- The phosphorescent paint applied to the performer's face is Fantasy F/XT Tube Makeup; Product #: FFX; Color Designation: GL; manufactured by Mehron Inc. of 100 Red Schoolhouse Rd., Chestnut Ridge, NY 10977.
- Basler A311f cameras 304 are used to capture the images of the performer.
- The underlying principles of the invention are not limited to any particular type of phosphorescent paint or camera.
- Embodiments of the invention may include various steps as set forth above.
- The steps may be embodied in machine-executable instructions which cause a general-purpose or special-purpose processor to perform certain steps.
- Various elements which are not relevant to the underlying principles of the invention, such as computer memory, hard drives, and input devices, have been left out of the figures to avoid obscuring the pertinent aspects of the invention.
- The various functional modules illustrated herein and the associated steps may be performed by specific hardware components that contain hardwired logic for performing the steps, such as an application-specific integrated circuit ("ASIC"), or by any combination of programmed computer components and custom hardware components.
- Elements of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions.
- The machine-readable medium may include, but is not limited to, flash memory, optical disks, CD-ROMs, DVD-ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media, or other types of machine-readable media suitable for storing electronic instructions.
- The present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2005800308469A CN101427564B (en) | 2005-03-10 | 2005-09-26 | Apparatus and method for performing motion capture using shutter synchronization |
KR1020077008614A KR101182623B1 (en) | 2005-03-10 | 2005-09-26 | Apparatus and method for performing motion capture using shutter synchronization |
NZ553106A NZ553106A (en) | 2005-03-10 | 2005-09-26 | Performing motion capture using shutter synchronization of cameras and phosphorescent paint applied to a performer |
AU2005329027A AU2005329027B2 (en) | 2005-03-10 | 2005-09-26 | Apparatus and method for performing motion capture using shutter synchronization |
JP2008500696A JP4705156B2 (en) | 2005-03-10 | 2005-09-26 | Apparatus and method for performing motion capture using shutter synchronization |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/077,628 US7605861B2 (en) | 2005-03-10 | 2005-03-10 | Apparatus and method for performing motion capture using shutter synchronization |
US11/077,628 | 2005-03-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2006098770A2 true WO2006098770A2 (en) | 2006-09-21 |
WO2006098770A3 WO2006098770A3 (en) | 2007-12-27 |
Family
ID=36475378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2005/034524 WO2006098770A2 (en) | 2005-03-10 | 2005-09-26 | Apparatus and method for performing motion capture using shutter synchronization |
Country Status (18)
Country | Link |
---|---|
US (1) | US7605861B2 (en) |
EP (1) | EP1715454B1 (en) |
JP (1) | JP4705156B2 (en) |
KR (1) | KR101182623B1 (en) |
CN (1) | CN101427564B (en) |
AT (1) | ATE405899T1 (en) |
AU (1) | AU2005329027B2 (en) |
CA (3) | CA3176795A1 (en) |
CY (1) | CY1109533T1 (en) |
DE (1) | DE602005009141D1 (en) |
DK (1) | DK1715454T3 (en) |
ES (1) | ES2313221T3 (en) |
HK (1) | HK1096752A1 (en) |
NZ (1) | NZ553106A (en) |
PL (1) | PL1715454T3 (en) |
PT (1) | PT1715454E (en) |
SI (1) | SI1715454T1 (en) |
WO (1) | WO2006098770A2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010508609A (en) * | 2006-11-01 | 2010-03-18 | ソニー株式会社 | Surface capture in motion pictures |
Families Citing this family (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10075750B2 (en) | 2002-12-10 | 2018-09-11 | Sony Interactive Entertainment America Llc | Porting locally processed media data with low latency to a remote client device via various wireless links |
US10277290B2 (en) | 2004-04-02 | 2019-04-30 | Rearden, Llc | Systems and methods to exploit areas of coherence in wireless systems |
US8654815B1 (en) | 2004-04-02 | 2014-02-18 | Rearden, Llc | System and method for distributed antenna wireless communications |
US9819403B2 (en) | 2004-04-02 | 2017-11-14 | Rearden, Llc | System and method for managing handoff of a client between different distributed-input-distributed-output (DIDO) networks based on detected velocity of the client |
US9826537B2 (en) | 2004-04-02 | 2017-11-21 | Rearden, Llc | System and method for managing inter-cluster handoff of clients which traverse multiple DIDO clusters |
US10425134B2 (en) | 2004-04-02 | 2019-09-24 | Rearden, Llc | System and methods for planned evolution and obsolescence of multiuser spectrum |
US9685997B2 (en) | 2007-08-20 | 2017-06-20 | Rearden, Llc | Systems and methods to enhance spatial diversity in distributed-input distributed-output wireless systems |
US20060055706A1 (en) * | 2004-09-15 | 2006-03-16 | Perlman Stephen G | Apparatus and method for capturing the motion of a performer |
US8194093B2 (en) * | 2004-09-15 | 2012-06-05 | Onlive, Inc. | Apparatus and method for capturing the expression of a performer |
AU2006225115B2 (en) | 2005-03-16 | 2011-10-06 | Lucasfilm Entertainment Company Ltd. | Three- dimensional motion capture |
DE102005020688B4 (en) * | 2005-05-03 | 2008-01-03 | Siemens Ag | mobile phone |
US7606392B2 (en) * | 2005-08-26 | 2009-10-20 | Sony Corporation | Capturing and processing facial motion data |
US7701487B2 (en) * | 2005-08-26 | 2010-04-20 | Sony Corporation | Multicast control of motion capture sequences |
US8780119B2 (en) * | 2005-08-26 | 2014-07-15 | Sony Corporation | Reconstruction render farm used in motion capture |
US20070076096A1 (en) * | 2005-10-04 | 2007-04-05 | Alexander Eugene J | System and method for calibrating a set of imaging devices and calculating 3D coordinates of detected features in a laboratory coordinate system |
US8659668B2 (en) * | 2005-10-07 | 2014-02-25 | Rearden, Llc | Apparatus and method for performing motion capture using a random pattern on capture surfaces |
US7567293B2 (en) * | 2006-06-07 | 2009-07-28 | Onlive, Inc. | System and method for performing motion capture by strobing a fluorescent lamp |
US7548272B2 (en) * | 2006-06-07 | 2009-06-16 | Onlive, Inc. | System and method for performing motion capture using phosphor application techniques |
US7667767B2 (en) * | 2006-06-07 | 2010-02-23 | Onlive, Inc. | System and method for three dimensional capture of stop-motion animated characters |
US7841946B2 (en) | 2006-06-29 | 2010-11-30 | Spawn Labs, Inc. | System for remote game access |
US8021160B2 (en) * | 2006-07-22 | 2011-09-20 | Industrial Technology Research Institute | Learning assessment method and device using a virtual tutor |
US8207963B2 (en) * | 2006-07-31 | 2012-06-26 | Onlive, Inc. | System and method for performing motion capture and image reconstruction |
US20100231692A1 (en) | 2006-07-31 | 2010-09-16 | Onlive, Inc. | System and method for performing motion capture and image reconstruction with transparent makeup |
US20080170750A1 (en) * | 2006-11-01 | 2008-07-17 | Demian Gordon | Segment tracking in motion picture |
US8542236B2 (en) * | 2007-01-16 | 2013-09-24 | Lucasfilm Entertainment Company Ltd. | Generating animation libraries |
US8199152B2 (en) * | 2007-01-16 | 2012-06-12 | Lucasfilm Entertainment Company Ltd. | Combining multiple session content for animation libraries |
US8130225B2 (en) | 2007-01-16 | 2012-03-06 | Lucasfilm Entertainment Company Ltd. | Using animation libraries for object identification |
US8437514B2 (en) * | 2007-10-02 | 2013-05-07 | Microsoft Corporation | Cartoon face generation |
US8144153B1 (en) | 2007-11-20 | 2012-03-27 | Lucasfilm Entertainment Company Ltd. | Model production for animation libraries |
US8831379B2 (en) * | 2008-04-04 | 2014-09-09 | Microsoft Corporation | Cartoon personalization |
US9142024B2 (en) * | 2008-12-31 | 2015-09-22 | Lucasfilm Entertainment Company Ltd. | Visual and physical motion sensing for three-dimensional motion capture |
TWI415032B (en) * | 2009-10-30 | 2013-11-11 | Univ Nat Chiao Tung | Object tracking method |
US9508176B2 (en) | 2011-11-18 | 2016-11-29 | Lucasfilm Entertainment Company Ltd. | Path and speed based character control |
US9224248B2 (en) * | 2012-07-12 | 2015-12-29 | Ulsee Inc. | Method of virtual makeup achieved by facial tracking |
US8786767B2 (en) * | 2012-11-02 | 2014-07-22 | Microsoft Corporation | Rapid synchronized lighting and shuttering |
US11189917B2 (en) | 2014-04-16 | 2021-11-30 | Rearden, Llc | Systems and methods for distributing radioheads |
US8998719B1 (en) | 2012-12-14 | 2015-04-07 | Elbo, Inc. | Network-enabled game controller |
US20140198185A1 (en) * | 2013-01-17 | 2014-07-17 | Cyberoptics Corporation | Multi-camera sensor for three-dimensional imaging of a circuit board |
US10488535B2 (en) | 2013-03-12 | 2019-11-26 | Rearden, Llc | Apparatus and method for capturing still images and video using diffraction coded imaging techniques |
US9973246B2 (en) | 2013-03-12 | 2018-05-15 | Rearden, Llc | Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology |
US9923657B2 (en) | 2013-03-12 | 2018-03-20 | Rearden, Llc | Systems and methods for exploiting inter-cell multiplexing gain in wireless cellular systems via distributed input distributed output technology |
RU2767777C2 (en) | 2013-03-15 | 2022-03-21 | Риарден, Ллк | Systems and methods of radio frequency calibration using the principle of reciprocity of channels in wireless communication with distributed input - distributed output |
US10126252B2 (en) | 2013-04-29 | 2018-11-13 | Cyberoptics Corporation | Enhanced illumination control for three-dimensional imaging |
WO2016030305A1 (en) * | 2014-08-29 | 2016-03-03 | Thomson Licensing | Method and device for registering an image to a model |
CN107423733A (en) * | 2017-09-06 | 2017-12-01 | 成都豪宇韬鹰科技有限公司 | Motion capture system based on mark point identification |
CN107390448A (en) * | 2017-09-06 | 2017-11-24 | 成都豪宇韬鹰科技有限公司 | A kind of active optical motion capture system |
KR102435406B1 (en) * | 2020-07-03 | 2022-08-24 | 현대모비스 주식회사 | Asynchronous control system in camera built-in lamp and method thereof |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6592465B2 (en) * | 2001-08-02 | 2003-07-15 | Acushnet Company | Method and apparatus for monitoring objects in flight |
US6633294B1 (en) * | 2000-03-09 | 2003-10-14 | Seth Rosenthal | Method and apparatus for using captured high density motion for animation |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3699856A (en) | 1970-04-01 | 1972-10-24 | Whittaker Corp | Movement monitoring apparatus |
US4389670A (en) * | 1981-12-30 | 1983-06-21 | The United States Of America As Represented By The United States Department Of Energy | Electronic method for autofluorography of macromolecules on two-D matrices |
US4417791A (en) | 1982-08-19 | 1983-11-29 | Jonathan Erland | Process for composite photography |
JP3036756B2 (en) * | 1989-06-12 | 2000-04-24 | 日本電気株式会社 | Oscillation circuit |
US5235416A (en) * | 1991-07-30 | 1993-08-10 | The Government Of The United States Of America As Represented By The Secretary Of The Department Of Health & Human Services | System and method for preforming simultaneous bilateral measurements on a subject in motion |
US5903388A (en) | 1992-06-11 | 1999-05-11 | Sedlmayr Steven R | High efficiency electromagnetic beam projector and systems and method for implementation thereof |
US5304809A (en) * | 1992-09-15 | 1994-04-19 | Luxtron Corporation | Luminescent decay time measurements by use of a CCD camera |
US5519826A (en) | 1994-04-29 | 1996-05-21 | Atari Games Corporation | Stop motion animation system |
US5480341A (en) | 1994-10-21 | 1996-01-02 | Strottman International, Inc. | Educational skeleton toy with outer shell |
US5569317A (en) * | 1994-12-22 | 1996-10-29 | Pitney Bowes Inc. | Fluorescent and phosphorescent tagged ink for indicia |
US6020892A (en) * | 1995-04-17 | 2000-02-01 | Dillon; Kelly | Process for producing and controlling animated facial representations |
US5852672A (en) | 1995-07-10 | 1998-12-22 | The Regents Of The University Of California | Image system for three dimensional, 360°, time sequence surface mapping of moving objects |
JP3745802B2 (en) | 1995-10-13 | 2006-02-15 | 株式会社日立製作所 | Image generation / display device |
JPH105229A (en) * | 1996-01-12 | 1998-01-13 | Egawa:Kk | Tomographic image equipment |
CN1192249C (en) * | 1996-06-19 | 2005-03-09 | 松下电工株式会社 | Automatic-tracing lighting equipment, lighting controller and tracing apparatus |
JP3873401B2 (en) | 1996-11-19 | 2007-01-24 | コニカミノルタセンシング株式会社 | 3D measurement system |
GB9626825D0 (en) * | 1996-12-24 | 1997-02-12 | Crampton Stephen J | Avatar kiosk |
DE59803158D1 (en) | 1998-03-07 | 2002-03-28 | Claussen Claus F | METHOD AND DEVICE FOR EVALUATING A MOTION PATTERN |
CA2372376C (en) | 1998-04-29 | 2007-11-20 | Carnegie Mellon University | Apparatus and method of monitoring a subject's eyes using two different wavelengths of light |
US6149719A (en) * | 1998-10-28 | 2000-11-21 | Hewlett-Packard Company | Light sensitive invisible ink compositions and methods for using the same |
US7483049B2 (en) * | 1998-11-20 | 2009-01-27 | Aman James A | Optimizations for live event, real-time, 3D object tracking |
US6554706B2 (en) | 2000-05-31 | 2003-04-29 | Gerard Jounghyun Kim | Methods and apparatus of displaying and evaluating motion data in a motion game apparatus |
US6850872B1 (en) | 2000-08-30 | 2005-02-01 | Microsoft Corporation | Facial image processing methods and systems |
US6950104B1 (en) * | 2000-08-30 | 2005-09-27 | Microsoft Corporation | Methods and systems for animating facial features, and methods and systems for expression transformation |
JP2003106812A (en) * | 2001-06-21 | 2003-04-09 | Sega Corp | Image information processing method, system and program utilizing the method |
US7333113B2 (en) * | 2003-03-13 | 2008-02-19 | Sony Corporation | Mobile motion capture cameras |
US7218320B2 (en) * | 2003-03-13 | 2007-05-15 | Sony Corporation | System and method for capturing facial and body motion |
US7068277B2 (en) * | 2003-03-13 | 2006-06-27 | Sony Corporation | System and method for animating a digital facial model |
US7358972B2 (en) * | 2003-05-01 | 2008-04-15 | Sony Corporation | System and method for capturing facial and body motion |
US7369681B2 (en) * | 2003-09-18 | 2008-05-06 | Pitney Bowes Inc. | System and method for tracking positions of objects in space, time as well as tracking their textual evolution |
US7777199B2 (en) * | 2004-09-17 | 2010-08-17 | Wichita State University | System and method for capturing image sequences at ultra-high framing rates |
CN101237817B (en) | 2004-12-28 | 2013-01-23 | 超级医药成像有限公司 | Hyperspectral/multispectral imaging in determination, assessment and monitoring of systemic physiology and shock |
US8330823B2 (en) * | 2006-11-01 | 2012-12-11 | Sony Corporation | Capturing surface in motion picture |
2005
- 2005-03-10 US US11/077,628 patent/US7605861B2/en active Active
- 2005-09-12 ES ES05108358T patent/ES2313221T3/en active Active
- 2005-09-12 PL PL05108358T patent/PL1715454T3/en unknown
- 2005-09-12 EP EP05108358A patent/EP1715454B1/en active Active
- 2005-09-12 DE DE602005009141T patent/DE602005009141D1/en active Active
- 2005-09-12 AT AT05108358T patent/ATE405899T1/en active
- 2005-09-12 SI SI200530475T patent/SI1715454T1/en unknown
- 2005-09-12 PT PT05108358T patent/PT1715454E/en unknown
- 2005-09-12 DK DK05108358T patent/DK1715454T3/en active
- 2005-09-13 CA CA3176795A patent/CA3176795A1/en active Pending
- 2005-09-13 CA CA2973956A patent/CA2973956C/en active Active
- 2005-09-13 CA CA2519737A patent/CA2519737C/en active Active
- 2005-09-26 KR KR1020077008614A patent/KR101182623B1/en active IP Right Grant
- 2005-09-26 AU AU2005329027A patent/AU2005329027B2/en active Active
- 2005-09-26 WO PCT/US2005/034524 patent/WO2006098770A2/en active Application Filing
- 2005-09-26 CN CN2005800308469A patent/CN101427564B/en active Active
- 2005-09-26 NZ NZ553106A patent/NZ553106A/en unknown
- 2005-09-26 JP JP2008500696A patent/JP4705156B2/en active Active
2007
- 2007-03-08 HK HK07102597A patent/HK1096752A1/en unknown
2008
- 2008-11-10 CY CY20081101274T patent/CY1109533T1/en unknown
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010508609A (en) * | 2006-11-01 | 2010-03-18 | ソニー株式会社 | Surface capture in motion pictures |
US8330823B2 (en) | 2006-11-01 | 2012-12-11 | Sony Corporation | Capturing surface in motion picture |
KR101342257B1 (en) * | 2006-11-01 | 2013-12-16 | 소니 픽쳐스 엔터테인먼트, 인크. | Capturing surface in motion picture |
Also Published As
Publication number | Publication date |
---|---|
DE602005009141D1 (en) | 2008-10-02 |
EP1715454A2 (en) | 2006-10-25 |
ATE405899T1 (en) | 2008-09-15 |
CA2519737C (en) | 2017-09-05 |
CA3176795A1 (en) | 2006-09-10 |
KR20070119603A (en) | 2007-12-20 |
ES2313221T3 (en) | 2009-03-01 |
CA2519737A1 (en) | 2006-09-10 |
KR101182623B1 (en) | 2012-09-14 |
PL1715454T3 (en) | 2009-04-30 |
HK1096752A1 (en) | 2007-06-08 |
US20060203096A1 (en) | 2006-09-14 |
PT1715454E (en) | 2008-11-28 |
AU2005329027B2 (en) | 2010-09-09 |
JP4705156B2 (en) | 2011-06-22 |
DK1715454T3 (en) | 2008-12-15 |
EP1715454B1 (en) | 2008-08-20 |
AU2005329027A1 (en) | 2006-09-21 |
SI1715454T1 (en) | 2009-02-28 |
NZ553106A (en) | 2011-07-29 |
CN101427564A (en) | 2009-05-06 |
CN101427564B (en) | 2011-06-15 |
US7605861B2 (en) | 2009-10-20 |
WO2006098770A3 (en) | 2007-12-27 |
CA2973956A1 (en) | 2006-09-10 |
CY1109533T1 (en) | 2014-08-13 |
CA2973956C (en) | 2023-01-17 |
EP1715454A3 (en) | 2006-12-13 |
JP2008533810A (en) | 2008-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2005329027B2 (en) | Apparatus and method for performing motion capture using shutter synchronization |
US11024072B2 (en) | Apparatus and method for performing motion capture using a random pattern on capture surfaces | |
Levin | Computer vision for artists and designers: pedagogic tools and techniques for novice programmers | |
CN101694385A (en) | Small target detection instrument based on Fourier optics and detection method thereof | |
US4405920A (en) | Enhancing the perceptibility of barely perceptible images | |
US20110025685A1 (en) | Combined geometric and shape from shading capture | |
US11860098B1 (en) | Method and device for three-dimensional object scanning with invisible markers | |
WO2023279286A1 (en) | Method and system for auto-labeling dvs frames | |
Lee et al. | Preliminary Study on Data Augmentation Methods for Sign Language Recognition | |
Osterloh et al. | On Human Perception of 3D Images | |
Levin | Extension 3: Vision | |
Schoo | Animating Highly Constrained Deformable Head/Face Models Using Motion Capture | |
Mertsching | Region-Based Artificial Visual Attention in Space and Time |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | |
| WWE | WIPO information: entry into national phase | Ref document number: 2005329027 Country of ref document: AU |
| WWE | WIPO information: entry into national phase | Ref document number: 553106 Country of ref document: NZ |
| WWE | WIPO information: entry into national phase | Ref document number: 1528/DELNP/2007 Country of ref document: IN |
| ENP | Entry into the national phase | Ref document number: 2005329027 Country of ref document: AU Date of ref document: 20050926 Kind code of ref document: A |
| WWP | WIPO information: published in national office | Ref document number: 2005329027 Country of ref document: AU |
| WWE | WIPO information: entry into national phase | Ref document number: 200580030846.9 Country of ref document: CN |
| WWE | WIPO information: entry into national phase | Ref document number: 2008500696 Country of ref document: JP |
| WWE | WIPO information: entry into national phase | Ref document number: 1020077008614 Country of ref document: KR |
| NENP | Non-entry into the national phase | Ref country code: DE |
| NENP | Non-entry into the national phase | Ref country code: RU |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 05801031 Country of ref document: EP Kind code of ref document: A2 |